From 0ec9b443b880d7ccae058c6adfd2f1ca4d77e8be Mon Sep 17 00:00:00 2001
From: xucz
Date: Fri, 23 Feb 2024 11:45:12 +0800
Subject: [PATCH] Dev/4.3.0 (#365)

* update code style
* [Android]add custom stream encrypt case.
* [Android]update README links.
* Dev/oc test (#353)
* ci adapting objective-c
* add objective-c APIExample project
* add export ipa
* modify ci script
* modify OC path
* ..
* ..
* update SDK version to 4.2.2
* ..
* fix rtmp push bug
* fix ios 12.0 version running bug
* fix fusion cdn ui bug
* fix fusion cdn change rtc bug
* fix fusion cdn ui bug
* add OC Module README
* modify README

---------

Co-authored-by: zhaoyongqiang

* Dev/4.2.3 (#354)
* add third player module
* add Audio Waveform module
* Fixed an issue where the player paused when a remote user joined while rtc and the player were used at the same time
* fix ui issue
* update content inspect config
* add Feature Available On Device
* add take snapshot ex
* update beautyAPI version to 1.0.3
* add Feature Available On Device
* add snapshot ex
* fix snapshot remote bug
* [Android]Add AudioRouterPlayer case.
* [Android]Add AudioWaveform case.
* [Android][Audio]add AudioWaveform case.
* [Android]adjust content inspect case.
* [Android]Add isFeatureAvailableOnDevice api in VideoProcessExtension.
* [Android]Add takeSnapshotEx for JoinMultipleChannel.
* [Android]update beauty api to 1.0.3 etc.
* [Windows]add snapshot for MultiChannel.
* [Windows]fix snapshot bug.
* fix oc create stream data bug
* fix swift create stream data bug
* [Android]fix remote render error when rejoining channel(NMS-15581).
* [Android]perfect PushExternalVideoYUV case.
* add file sharing key
* fix title
* fix multi channel bug
* fix content inspect bug
* [Windows]fix media player crash.
* [Android]perfect MultiVideoSourceTracks case.
* fix input token crash bug
* fix input token crash bug
* [Android]Update readme. (#355)
* [Android]add 4K 60fps h265. (#356)
* [Android]fix ui bug.
* [Android]fix render bug(CSD-59845).
* Fix the issue of no sound during AVPlayer playback
* [Android]add cases of enableVideoImageSource and setAINSMode api etc.
* [Android]add setAINSMode api case etc.
* add video image push
* add AINS Mode
* iOS Add AINS Mode
* ios add video image push
* [Windows]add enableVideoImageSource and setAINSMode api case.
* [MacOS]fix audio recording path bug.
* fix startAudioRecording path bug
* fix audio session change issue
* fix video image source issue
* ..
* fix exit screen leave channel issue
* screen sharing auto close system interface
* [Android]update rtc version etc.
* [Windows]Update rtc version.
* update SDK version to 4.2.3

---------

Co-authored-by: zhaoyongqiang

* Update setLocalAccessPoint note etc. (#358)
* Add gitee sync script.
* Update gitee sync script.
* Test gitee sync.
* Feat/code style android (#366)
* [Android]add check style config.
* [Android]update check style config.
* [Android]update codes with checked style.
* [Android]Adapt to 4.3.0 and add audio stream selector for media player.
* [Android]Add face capture case etc.
* [Android][Audio]Adapt to 4.3.0 rtc sdk.
* [Android]Update rtc version.
* update sdk 4.3.0
* iOS update SDK 4.3.0
* add multi audio track case
* add face capture case
* [Windows]Adapt to 4.3.0, add audio stream selector for mediaplayer etc.
* update sdk version to 4.3.0
* [Windows]add face capture case.
* [Android]update cpp include.
* [Android]Adapt to 4.3.0 latest version.
* [Android]update cpp include.
* [Android][Audio]Adapt to 4.3.0 latest version.
* [windows]adapt to 4.3.0 latest version and fix bugs.
* fix rawVideo take snapshot crash bug
* fix rawVideo snapshot bug
* fix media publish streaming bug
* [Android]try to fix NMS-17758 etc.
* [Android]fix bugs.
* [Android]add background recording function for JoinChannelAudio case(ADC-4803).
* [Android][Audio]add background recording function for JoinChannelAudio case(ADC-4803) and perfect case.
* [Android]Adapt to 4.3.0 latest version.
* [Android]fix custom audio render api call problem(DEVEX-65).
* fix multiChannel take snapshot bug
* fix custom video render bug
* update startEchoTest
* [Windows] fix windows screen share bug.
* update sdk document address
* [Windows]Add audio echo test user case.
* [Android]Perfect basic video case and fix some bugs.
* [Android]Perfect rtmp streaming case.
* Add Camera Test Function
* [Android]Fix audio route bug.
* [Android]Add video echo test.
* [Android]fix echo test bug.
* [Android]perfect echo test example.
* add rtc connection state callback
* fix MediaPlayer publish stream UI bug
* fix custom capture Video push bug
* Fix the self-rendering issue on Intel chips
* [windows]fix spatial audio zone bug.
* fix custom video push bug
* fix precall test bug
* [Windows]add video echo test api example.
* [Windows]add basic join channel video by token example.
* [Windows]fix spatial audio zone bug(NMS-18784)
* update start echo test
* update spatial audio method
* [Android]fix audio problem.
* [Android][Audio]fix audio problem.
* update spatial audio remote user position
* Optimize the code format
* [Android]fix NMS-19192/NMS-19194.
* [Android][Audio]fix NMS-19192.
* adapt video extension for v430 (#370)

Co-authored-by: Qiming Deng

* update rtc sdk pod version to 4.3.0
* [windows]Update sdk download url.
--------- Co-authored-by: zhaoyongqiang Co-authored-by: cleven <543069316@qq.com> Co-authored-by: sync2gitee Co-authored-by: DengQiming-private <71064074+DengQiming-private@users.noreply.github.com> Co-authored-by: Qiming Deng --- .githooks/pre-commit | 38 + .github/workflows/gitee-sync.yml | 2 +- Android/APIExample-Audio/app/build.gradle | 2 +- .../app/src/main/AndroidManifest.xml | 6 + .../api/example/common/BaseFragment.java | 121 +- .../example/common/model/StatisticsInfo.java | 16 +- .../common/widget/AudioSeatManager.java | 76 +- .../examples/advanced/SpatialSound.java | 69 +- .../customaudio/CustomAudioRender.java | 120 +- .../examples/basic/JoinChannelAudio.java | 257 +- .../res/layout/fragment_spatial_sound.xml | 4 +- .../app/src/main/res/values-zh/strings.xml | 3 +- .../app/src/main/res/values/arrays.xml | 1 - .../app/src/main/res/values/strings.xml | 3 +- Android/APIExample/README.md | 29 +- Android/APIExample/README.zh.md | 27 +- .../agora-simple-filter/build.gradle | 4 +- .../src/main/cpp/AgoraRtcKit/AgoraBase.h | 593 +- .../AgoraRtcKit/AgoraExtensionProviderEntry.h | 32 +- .../src/main/cpp/AgoraRtcKit/AgoraMediaBase.h | 100 +- .../cpp/AgoraRtcKit/AgoraMediaPlayerTypes.h | 109 +- .../cpp/AgoraRtcKit/AgoraRefCountedObject.h | 8 + .../main/cpp/AgoraRtcKit/IAgoraFileUploader.h | 5 +- .../src/main/cpp/AgoraRtcKit/IAgoraLog.h | 4 +- .../cpp/AgoraRtcKit/IAgoraMediaPlayerSource.h | 74 +- .../main/cpp/AgoraRtcKit/IAgoraParameter.h | 5 +- .../main/cpp/AgoraRtcKit/IAgoraRtmService.h | 19 +- .../AgoraRtcKit/IAgoraRtmpStreamingService.h | 17 +- .../src/main/cpp/AgoraRtcKit/IAgoraService.h | 54 +- .../AgoraRtcKit/NGIAgoraAudioDeviceManager.h | 43 +- .../main/cpp/AgoraRtcKit/NGIAgoraAudioTrack.h | 49 +- .../cpp/AgoraRtcKit/NGIAgoraCameraCapturer.h | 29 +- .../cpp/AgoraRtcKit/NGIAgoraDataChannel.h | 13 +- .../AgoraRtcKit/NGIAgoraExtensionControl.h | 3 +- .../main/cpp/AgoraRtcKit/NGIAgoraLocalUser.h | 163 +- .../main/cpp/AgoraRtcKit/NGIAgoraMediaNode.h | 41 +- 
.../AgoraRtcKit/NGIAgoraMediaNodeFactory.h | 14 +- .../NGIAgoraRemoteAudioMixerSource.h | 5 +- .../cpp/AgoraRtcKit/NGIAgoraRtcConnection.h | 46 +- .../cpp/AgoraRtcKit/NGIAgoraRtmpConnection.h | 7 +- .../cpp/AgoraRtcKit/NGIAgoraRtmpLocalUser.h | 60 +- .../cpp/AgoraRtcKit/NGIAgoraScreenCapturer.h | 21 +- .../main/cpp/AgoraRtcKit/NGIAgoraVideoFrame.h | 3 +- .../AgoraRtcKit/NGIAgoraVideoMixerSource.h | 24 +- .../main/cpp/AgoraRtcKit/NGIAgoraVideoTrack.h | 37 +- .../src/main/cpp/AgoraRtcKit/api/ahpl_ares.h | 77 + .../src/main/cpp/AgoraRtcKit/api/ahpl_defs.h | 171 + .../src/main/cpp/AgoraRtcKit/api/ahpl_poll.h | 51 + .../src/main/cpp/AgoraRtcKit/api/ahpl_ref.h | 230 + .../src/main/cpp/AgoraRtcKit/api/ahpl_types.h | 96 + .../cpp/AgoraRtcKit/api/cpp/ahpl_ares_class.h | 91 + .../cpp/AgoraRtcKit/api/cpp/ahpl_poll_class.h | 129 + .../cpp/AgoraRtcKit/api/cpp/ahpl_ref_class.h | 1002 ++ .../src/main/cpp/CMakeLists.txt | 10 +- .../cpp/plugin_source_code/VideoProcessor.cpp | 30 +- .../cpp/plugin_source_code/VideoProcessor.h | 4 + .../agora-stream-encrypt/.gitignore | 1 + .../agora-stream-encrypt/build.gradle | 48 + .../agora-stream-encrypt/consumer-rules.pro | 0 .../agora-stream-encrypt/proguard-rules.pro | 21 + .../src/main/AndroidManifest.xml | 3 + .../agora-stream-encrypt/src/main/agoraLibs | 1 + .../src/main/cpp/CMakeLists.txt | 72 + .../src/main/cpp/include/agora/AgoraBase.h | 6211 ++++++++++++ .../main/cpp/include/agora/AgoraMediaBase.h | 1671 ++++ .../cpp/include/agora/AgoraMediaPlayerTypes.h | 516 + .../main/cpp/include/agora/AgoraOptional.h | 891 ++ .../src/main/cpp/include/agora/AgoraRefPtr.h | 156 + .../cpp/include/agora/IAgoraH265Transcoder.h | 178 + .../src/main/cpp/include/agora/IAgoraLog.h | 98 + .../agora/IAgoraMediaComponentFactory.h | 41 + .../cpp/include/agora/IAgoraMediaEngine.h | 270 + .../cpp/include/agora/IAgoraMediaPlayer.h | 644 ++ .../include/agora/IAgoraMediaPlayerSource.h | 504 + .../cpp/include/agora/IAgoraMediaRecorder.h | 90 + 
.../agora/IAgoraMediaStreamingSource.h | 332 + .../include/agora/IAgoraMusicContentCenter.h | 538 ++ .../main/cpp/include/agora/IAgoraParameter.h | 310 + .../cpp/include/agora/IAgoraRhythmPlayer.h | 92 + .../main/cpp/include/agora/IAgoraRtcEngine.h | 8471 +++++++++++++++++ .../cpp/include/agora/IAgoraRtcEngineEx.h | 1961 ++++ .../cpp/include/agora/IAgoraSpatialAudio.h | 302 + .../cpp/include/agora/IAudioDeviceManager.h | 482 + .../src/main/cpp/include/agora/time_utils.h | 85 + .../include/packet_processing_plugin_jni.h | 12 + .../main/cpp/packet_processing_plugin_jni.cpp | 174 + .../api/streamencrypt/PacketProcessor.java | 18 + Android/APIExample/app/build.gradle | 6 +- .../app/src/main/AndroidManifest.xml | 3 + .../io/agora/api/example/MainApplication.java | 13 +- .../io/agora/api/example/MainFragment.java | 17 +- .../io/agora/api/example/ReadyFragment.java | 5 +- .../io/agora/api/example/SettingActivity.java | 38 +- .../agora/api/example/annotation/Example.java | 3 - .../example/common/BaseBrowserFragment.java | 16 +- .../api/example/common/BaseFragment.java | 50 +- .../io/agora/api/example/common/Constant.java | 45 +- .../common/adapter/ExampleSection.java | 27 + .../common/floatwindow/AVCallFloatView.java | 62 +- .../common/floatwindow/FloatWindowHelper.java | 52 +- .../common/floatwindow/rom/HuaweiUtils.java | 29 +- .../common/floatwindow/rom/MeizuUtils.java | 19 +- .../common/floatwindow/rom/MiuiUtils.java | 40 +- .../common/floatwindow/rom/OppoUtils.java | 19 +- .../common/floatwindow/rom/QikuUtils.java | 20 +- .../common/floatwindow/rom/RomUtils.java | 54 +- .../example/common/gles/Drawable2dFull.java | 7 +- .../common/gles/Drawable2dLandmarks.java | 33 - .../api/example/common/gles/GLTestUtils.java | 125 - .../api/example/common/gles/GLThread.java | 3 + .../example/common/gles/ProgramLandmarks.java | 143 - .../example/common/gles/ProgramTexture2d.java | 124 - .../common/gles/ProgramTextureOES.java | 35 +- .../example/common/gles/core/Drawable2d.java | 50 
+- .../api/example/common/gles/core/EglCore.java | 63 +- .../common/gles/core/EglSurfaceBase.java | 32 +- .../example/common/gles/core/Extensions.java | 23 + .../api/example/common/gles/core/GlUtil.java | 35 +- .../common/gles/core/OffscreenSurface.java | 39 - .../api/example/common/gles/core/Program.java | 67 +- .../common/gles/core/WindowSurface.java | 95 - .../api/example/common/model/ExampleBean.java | 72 + .../api/example/common/model/Examples.java | 26 +- .../example/common/model/GlobalSettings.java | 96 +- .../agora/api/example/common/model/Peer.java | 16 - .../example/common/model/StatisticsInfo.java | 94 +- .../common/widget/AudioOnlyLayout.java | 42 +- .../common/widget/AudioSeatManager.java | 76 +- .../common/widget/VideoReportLayout.java | 66 +- .../example/common/widget/WaveformView.java | 111 +- .../CDNStreaming/AudienceFragment.java | 46 +- .../advanced/CDNStreaming/EntryFragment.java | 38 +- .../advanced/CDNStreaming/HostFragment.java | 37 +- .../examples/advanced/ChannelEncryption.java | 193 +- .../examples/advanced/ContentInspect.java | 27 +- .../advanced/CustomRemoteVideoRender.java | 81 +- .../examples/advanced/FaceCapture.java | 419 + .../examples/advanced/HostAcrossChannel.java | 166 +- .../examples/advanced/InCallReport.java | 142 +- .../advanced/JoinMultipleChannel.java | 54 +- .../examples/advanced/KtvCopyrightMusic.java | 10 +- .../examples/advanced/LiveStreaming.java | 48 +- .../advanced/LocalVideoTranscoding.java | 113 +- .../examples/advanced/MediaPlayer.java | 153 +- .../examples/advanced/MediaRecorder.java | 70 +- .../advanced/MultiVideoSourceTracks.java | 115 +- .../examples/advanced/PictureInPicture.java | 44 +- .../examples/advanced/PlayAudioFiles.java | 179 +- .../examples/advanced/PreCallTest.java | 113 +- .../advanced/ProcessAudioRawData.java | 64 +- .../examples/advanced/ProcessRawData.java | 94 +- .../examples/advanced/PushExternalVideo.java | 81 +- .../advanced/PushExternalVideoYUV.java | 77 +- 
.../examples/advanced/RTMPStreaming.java | 232 +- .../examples/advanced/RhythmPlayer.java | 121 +- .../examples/advanced/ScreenSharing.java | 30 +- .../examples/advanced/SendDataStream.java | 142 +- .../examples/advanced/SimpleExtension.java | 82 +- .../examples/advanced/SpatialSound.java | 65 +- .../advanced/SwitchCameraScreenShare.java | 73 +- .../examples/advanced/ThirdPartyBeauty.java | 3 + .../examples/advanced/VideoMetadata.java | 160 +- .../advanced/VideoProcessExtension.java | 162 +- .../examples/advanced/VideoQuickSwitch.java | 262 +- .../examples/advanced/VoiceEffects.java | 52 +- .../advanced/beauty/ByteDanceBeauty.java | 58 +- .../advanced/beauty/ByteDanceBeautySDK.kt | 76 +- .../advanced/beauty/FaceUnityBeauty.java | 20 +- .../advanced/beauty/FaceUnityBeautySDK.kt | 57 +- .../advanced/beauty/SenseTimeBeauty.java | 43 +- .../advanced/beauty/SenseTimeBeautySDK.kt | 78 +- .../advanced/customaudio/AudioPlayer.java | 76 +- .../customaudio/CustomAudioRender.java | 120 +- .../customaudio/CustomAudioSource.java | 69 +- .../advanced/videoRender/GLTextureView.java | 467 +- .../advanced/videoRender/YuvFboProgram.java | 25 +- .../advanced/videoRender/YuvUploader.java | 18 + .../examples/audio/AudioRouterPlayer.java | 5 +- .../examples/audio/AudioRouterPlayerExo.java | 44 +- .../examples/audio/AudioRouterPlayerIjk.java | 44 +- .../audio/AudioRouterPlayerNative.java | 44 +- .../example/examples/audio/AudioWaveform.java | 3 + .../examples/basic/JoinChannelAudio.java | 258 +- .../examples/basic/JoinChannelVideo.java | 164 +- .../basic/JoinChannelVideoByToken.java | 97 +- .../api/example/utils/AudioFileReader.java | 60 +- .../agora/api/example/utils/ClassUtils.java | 174 +- .../agora/api/example/utils/CommonUtil.java | 17 +- .../example/utils/DefaultPoolExecutor.java | 55 +- .../example/utils/DefaultThreadFactory.java | 33 +- .../io/agora/api/example/utils/FileKtUtils.kt | 25 +- .../io/agora/api/example/utils/FileUtils.java | 37 +- 
.../io/agora/api/example/utils/TextUtils.java | 14 +- .../agora/api/example/utils/TokenUtils.java | 59 +- .../api/example/utils/VideoFileReader.java | 55 +- .../io/agora/api/example/utils/YUVUtils.java | 153 +- .../beautyapi/bytedance/ByteDanceBeautyAPI.kt | 167 + .../bytedance/ByteDanceBeautyAPIImpl.kt | 136 +- .../bytedance/utils/AgoraImageHelper.kt | 19 + .../bytedance/utils/GLTestUtils.java | 34 +- .../beautyapi/bytedance/utils/ImageUtil.java | 443 +- .../beautyapi/bytedance/utils/LogUtils.kt | 38 + .../beautyapi/bytedance/utils/StatsHelper.kt | 16 + .../bytedance/utils/opengl/Drawable2d.java | 151 +- .../bytedance/utils/opengl/Extensions.java | 23 + .../bytedance/utils/opengl/GlUtil.java | 245 +- .../bytedance/utils/opengl/Program.java | 124 +- .../utils/opengl/ProgramManager.java | 31 +- .../utils/opengl/ProgramTexture2d.java | 57 +- .../utils/opengl/ProgramTextureOES.java | 59 +- .../utils/opengl/ProgramTextureYUV.java | 88 +- .../beautyapi/faceunity/FaceUnityBeautyAPI.kt | 164 + .../faceunity/FaceUnityBeautyAPIImpl.kt | 126 + .../faceunity/utils/FuDeviceUtils.java | 112 +- .../beautyapi/faceunity/utils/LogUtils.kt | 38 + .../beautyapi/faceunity/utils/StatsHelper.kt | 16 + .../faceunity/utils/egl/EGLContextHelper.java | 94 +- .../faceunity/utils/egl/GLCopyHelper.java | 39 +- .../faceunity/utils/egl/GLFrameBuffer.java | 105 +- .../utils/egl/GLTextureBufferQueue.kt | 54 + .../faceunity/utils/egl/GLUtils.java | 75 +- .../utils/egl/TextureProcessHelper.kt | 43 + .../beautyapi/sensetime/SenseTimeBeautyAPI.kt | 171 + .../sensetime/SenseTimeBeautyAPIImpl.kt | 139 + .../beautyapi/sensetime/utils/LogUtils.kt | 38 + .../beautyapi/sensetime/utils/StatsHelper.kt | 16 + .../sensetime/utils/egl/GLCopyHelper.java | 39 +- .../sensetime/utils/egl/GLFrameBuffer.java | 105 +- .../sensetime/utils/egl/GLTestUtils.java | 35 +- .../utils/egl/GLTextureBufferQueue.kt | 54 + .../beautyapi/sensetime/utils/egl/GlUtil.java | 103 +- .../utils/processor/Accelerometer.java | 72 +- 
.../utils/processor/BeautyProcessor.kt | 40 +- .../sensetime/utils/processor/FaceDetector.kt | 63 +- .../utils/processor/IBeautyProcessor.kt | 72 +- .../main/res/layout/fragment_face_capture.xml | 59 + .../main/res/layout/fragment_media_player.xml | 98 +- .../main/res/layout/fragment_precall_test.xml | 34 +- .../res/layout/fragment_spatial_sound.xml | 4 +- .../app/src/main/res/navigation/nav_graph.xml | 9 + .../app/src/main/res/values-zh/strings.xml | 11 +- .../app/src/main/res/values/arrays.xml | 2 +- .../app/src/main/res/values/strings.xml | 11 +- Android/APIExample/build.gradle | 6 + Android/APIExample/checkstyle.gradle | 55 + Android/APIExample/checkstyle.xml | 230 + Android/APIExample/detekt-baseline.xml | 8 + Android/APIExample/detekt-config.yml | 751 ++ Android/APIExample/detekt.gradle | 55 + Android/APIExample/git-hooks.gradle | 5 + Android/APIExample/gradle.properties | 7 +- Android/APIExample/settings.gradle | 3 + .../Advanced/AudioMixing/AudioMixing.swift | 8 +- .../CustomAudioRender/CustomAudioRender.swift | 8 +- .../CustomAudioSource/CustomAudioSource.swift | 8 +- .../CustomPcmAudioSource.swift | 8 +- .../Advanced/PrecallTest/PrecallTest.swift | 10 +- .../Advanced/RawAudioData/RawAudioData.swift | 8 +- .../Advanced/RhythmPlayer/RhythmPlayer.swift | 8 +- .../Advanced/SpatialAudio/SpatialAudio.swift | 40 +- .../Advanced/VoiceChanger/VoiceChanger.swift | 11 +- .../JoinChannelAudioToken.swift | 8 +- .../JoinChannelAudio/JoinChannelAudio.swift | 8 +- iOS/APIExample-Audio/Podfile | 4 +- .../ExternalVideo/AgoraMetalRender.swift | 5 +- .../Advanced/AudioMixing/AudioMixing.m | 11 +- .../Advanced/ContentInspect/ContentInspect.m | 8 +- .../CreateDataStream/CreateDataStream.m | 8 +- .../CustomAudioRender/CustomAudioRender.m | 8 +- .../CustomPcmAudioSource.m | 8 +- .../CustomVideoRender/CustomVideoRender.m | 8 +- .../CustomVideoSourcePush.m | 15 +- .../Examples/Advanced/FusionCDN/FusionCDN.m | 30 +- .../JoinMultiChannel/JoinMultiChannel.m | 16 +- 
.../Advanced/LiveStreaming/LiveStreaming.m | 10 +- .../MediaChannelRelay/MediaChannelRelay.m | 8 +- .../Advanced/MediaPlayer/MediaPlayer.m | 18 +- .../Advanced/MutliCamera/MutliCamera.m | 8 +- .../PictureInPicture/PictureInPicture.m | 8 +- .../Advanced/RTMPStreaming/RTMPStreaming.m | 12 +- .../Advanced/RawAudioData/RawAudioData.m | 8 +- .../Advanced/RawVideoData/RawVideoData.m | 12 +- .../Advanced/RhythmPlayer/RhythmPlayer.m | 8 +- .../Advanced/ScreenShare/ScreenShare.m | 11 +- .../Advanced/SimpleFilter/SimpleFilter.m | 8 +- .../Advanced/SpatialAudio/SpatialAudio.m | 54 +- .../StreamEncryption/StreamEncryption.m | 8 +- .../Advanced/VideoMetadata/VideoMetadata.m | 8 +- .../Advanced/VideoProcess/VideoProcess.m | 8 +- .../Advanced/VoiceChanger/VoiceChanger.m | 8 +- .../Basic/JoinChannelAudio/JoinChannelAudio.m | 8 +- .../JoinChannelVideoRecorder.m | 15 +- .../JoinChannelVideoToken.m | 8 +- .../Basic/JoinChannelVideo/JoinChannelVideo.m | 8 +- iOS/APIExample-OC/Podfile | 12 +- iOS/APIExample/.swiftlint.yml | 40 + .../APIExample.xcodeproj/project.pbxproj | 58 +- iOS/APIExample/APIExample/AppDelegate.swift | 3 - .../Common/ARKit/ARVideoRenderer.swift | 2 +- .../APIExample/Common/AgoraExtension.swift | 193 +- .../APIExample/Common/AlertManager.swift | 52 +- .../Common/BaseViewController.swift | 40 +- .../Common/EntryViewController.swift | 17 +- .../ExternalAudio/AgoraPcmSourcePush.swift | 2 +- .../Common/ExternalAudio/ExternalAudio.h | 7 +- .../Common/ExternalAudio/ExternalAudio.mm | 15 +- .../Common/ExternalAudio/ZSNBoxingView.m | 2 +- .../ExternalVideo/AgoraCameraSourcePush.swift | 11 +- .../ExternalVideo/AgoraMetalRender.swift | 46 +- .../APIExample/Common/GlobalSettings.swift | 28 +- .../APIExample/Common/KeyCenter.swift | 7 +- .../APIExample/Common/LogViewController.swift | 16 +- .../Common/NetworkManager/JSONObject.swift | 42 +- .../NetworkManager/NetworkManager.swift | 22 +- .../Common/NetworkManager/ToastView.swift | 9 +- .../APIExample/Common/PickerView.swift | 2 
+- .../Common/Settings/SettingsCells.swift | 95 +- .../Settings/SettingsViewController.swift | 15 +- .../APIExample/Common/StatisticsInfo.swift | 34 +- .../APIExample/Common/UITypeAlias.swift | 105 +- .../APIExample/Common/Utils/Util.swift | 16 +- .../APIExample/Common/VideoView.swift | 48 +- .../Examples/Advanced/ARKit/ARKit.swift | 63 +- .../Advanced/AudioMixing/AudioMixing.swift | 140 +- .../AudioWaveform/AudioWaveform.swift | 64 +- .../AuidoRouterPlayer/AuidoRouterPlayer.swift | 77 +- .../ContentInspect/ContentInspect.swift | 19 +- .../CreateDataStream/CreateDataStream.swift | 52 +- .../CustomAudioRender/CustomAudioRender.swift | 38 +- .../CustomAudioSource/CustomAudioSource.swift | 52 +- .../CustomPcmAudioSource.swift | 26 +- .../CustomVideoRender/CustomVideoRender.swift | 28 +- .../CustomVideoSourcePush.swift | 36 +- .../CustomVideoSourcePushMulti.swift | 49 +- .../Base.lproj/FaceCapture.storyboard | 100 + .../Advanced/FaceCapture/FaceCapture.swift | 290 + .../zh-Hans.lproj/FaceCapture.strings | 21 + .../Advanced/FusionCDN/FusionCDN.swift | 178 +- .../JoinMultiChannel/JoinMultiChannel.swift | 50 +- .../KtvCopyrightMusic/KtvCopyrightMusic.swift | 1 - .../LiveStreaming/LiveStreaming.swift | 84 +- .../MediaChannelRelay/MediaChannelRelay.swift | 45 +- .../Base.lproj/MediaPlayer.storyboard | 46 +- .../Advanced/MediaPlayer/MediaPlayer.swift | 150 +- .../zh-Hans.lproj/MediaPlayer.strings | 5 + .../Advanced/MutliCamera/MutliCamera.swift | 22 +- .../PictureInPicture/PictureInPicture.swift | 81 +- .../Base.lproj/PrecallTest.storyboard | 70 +- .../Advanced/PrecallTest/PrecallTest.swift | 98 +- .../zh-Hans.lproj/PrecallTest.strings | 4 +- .../QuickSwitchChannel.swift | 50 +- .../RTMPStreaming/RTMPStreaming.swift | 46 +- .../Advanced/RawAudioData/RawAudioData.swift | 8 +- .../Advanced/RawMediaData/RawMediaData.swift | 18 +- .../Advanced/RawVideoData/RawVideoData.swift | 34 +- .../Advanced/RhythmPlayer/RhythmPlayer.swift | 40 +- .../Advanced/ScreenShare/ScreenShare.swift | 
40 +- .../Advanced/SimpleFilter/SimpleFilter.swift | 48 +- .../Advanced/SpatialAudio/SpatialAudio.swift | 54 +- .../StreamEncryption/StreamEncryption.swift | 66 +- .../ThirdBeautify/ThirdBeautify.swift | 2 +- .../Advanced/VideoChat/VideoChat.swift | 49 +- .../VideoMetadata/VideoMetadata.swift | 35 +- .../Advanced/VideoProcess/VideoProcess.swift | 57 +- .../Advanced/VoiceChanger/VoiceChanger.swift | 199 +- .../JoinChannelAudio/JoinChannelAudio.swift | 90 +- .../JoinChannelVideoRecorder.swift | 96 +- .../JoinChannelVideoToken.swift | 43 +- .../JoinChannelVideo/JoinChannelVideo.swift | 63 +- iOS/APIExample/APIExample/Info.plist | 5 - .../APIExample/ViewController.swift | 91 +- .../zh-Hans.lproj/Localizable.strings | 2 + .../SampleHandler.swift | 2 +- iOS/APIExample/Podfile | 13 +- macOS/APIExample.xcodeproj/project.pbxproj | 40 +- .../ExternalVideo/AgoraMetalRender.swift | 125 +- macOS/APIExample/Common/KeyCenter.swift | 9 +- macOS/APIExample/Common/StatisticsInfo.swift | 8 + macOS/APIExample/Common/VideoView.swift | 8 +- macOS/APIExample/Common/VideoView.xib | 22 +- .../Advanced/AudioMixing/AudioMixing.swift | 17 +- .../ChannelMediaRelay/ChannelMediaRelay.swift | 10 +- .../ContentInspect/ContentInspect.swift | 8 +- .../CreateDataStream/CreateDataStream.swift | 8 +- .../CustomAudioRender/CustomAudioRender.swift | 8 +- .../CustomAudioSource/CustomAudioSource.swift | 8 +- .../CustomVideoRender/CustomVideoRender.swift | 8 +- .../CustomVideoSourcePush.swift | 17 +- .../CustomVideoSourcePushMulti.swift | 14 +- .../Base.lproj/FaceCapture.storyboard | 132 + .../Advanced/FaceCapture/FaceCapture.swift | 464 + .../zh-Hans.lproj/FaceCapture.strings | 24 + .../JoinMultiChannel/JoinMultiChannel.swift | 18 +- .../LiveStreaming/LiveStreaming.swift | 12 +- .../LocalCompositeGraph.swift | 16 +- .../Base.lproj/MediaPlayer.storyboard | 65 +- .../Advanced/MediaPlayer/MediaPlayer.swift | 111 +- .../en.lproj/MediaPlayer.storyboard | 43 +- .../zh-Hans.lproj/MediaPlayer.strings | 5 + 
.../MultiCameraSourece.swift | 16 +- .../Base.lproj/PrecallTest.storyboard | 51 +- .../Advanced/PrecallTest/PrecallTest.swift | 41 +- .../zh-Hans.lproj/PrecallTest.strings | 2 + .../QuickSwitchChannel.swift | 4 +- .../RTMPStreaming/RTMPStreaming.swift | 12 +- .../Advanced/RawAudioData/RawAudioData.swift | 8 +- .../Advanced/RawVideoData/RawVideoData.swift | 8 +- .../Advanced/ScreenShare/ScreenShare.swift | 16 +- .../Advanced/SimpleFilter/SimpleFilter.swift | 12 +- .../Advanced/SpatialAudio/SpatialAudio.swift | 28 +- .../StreamEncryption/StreamEncryption.swift | 8 +- .../Advanced/VideoProcess/VideoProcess.swift | 8 +- .../Advanced/VoiceChanger/VoiceChanger.swift | 11 +- .../JoinChannelAudio/JoinChannelAudio.swift | 11 +- .../JoinChannelVideoRecorder.swift | 15 +- .../JoinChannelVideoToken.swift | 14 +- .../JoinChannelVideo/JoinChannelVideo.swift | 12 +- macOS/APIExample/ViewController.swift | 1 + .../zh-Hans.lproj/Localizable.strings | 4 + macOS/Podfile | 8 +- windows/APIExample/APIExample/APIExample.rc | 125 +- .../APIExample/APIExample/APIExample.vcxproj | 4 + .../APIExample/APIExample.vcxproj.filters | 18 + .../APIExample/APIExample/APIExampleDlg.cpp | 28 +- windows/APIExample/APIExample/APIExampleDlg.h | 6 +- .../APIExample/Advanced/Beauty/CDlgBeauty.cpp | 1 + .../CAgoraCaptureAudioDlg.cpp | 9 +- .../FaceCapture/CAgoraFaceCaptureDlg.cpp | 550 ++ .../FaceCapture/CAgoraFaceCaptureDlg.h | 199 + .../MediaPlayer/CAgoraMediaPlayer.cpp | 102 +- .../Advanced/MediaPlayer/CAgoraMediaPlayer.h | 14 +- .../MediaRecorder/CAgoraMediaRecorder.h | 2 +- .../MultiChannel/CAgoraMultiChannelDlg.cpp | 4 +- .../MultiChannel/CAgoraMultiChannelDlg.h | 2 +- .../PreCallTest/CAgoraPreCallTestDlg.cpp | 92 + .../PreCallTest/CAgoraPreCallTestDlg.h | 14 + .../RTMPStream/AgoraRtmpStreaming.cpp | 24 +- .../Advanced/RTMPStream/AgoraRtmpStreaming.h | 2 +- .../ReportInCall/CAgoraReportInCallDlg.h | 4 +- .../ScreenShare/AgoraScreenCapture.cpp | 14 +- .../SpatialAudio/CAgoraSpatialAudioDlg.cpp | 51 +- 
.../SpatialAudio/CAgoraSpatialAudioDlg.h | 12 +- .../CJoinChannelVideoByTokenDlg.cpp | 729 ++ .../CJoinChannelVideoByTokenDlg.h | 212 + .../LiveBroadcasting/CLiveBroadcastingDlg.cpp | 4 +- .../LiveBroadcasting/CLiveBroadcastingDlg.h | 4 +- windows/APIExample/APIExample/Language.h | 22 +- windows/APIExample/APIExample/en.ini | 32 +- windows/APIExample/APIExample/resource.h | 17 +- windows/APIExample/APIExample/stdafx.cpp | 50 +- windows/APIExample/APIExample/stdafx.h | 10 +- windows/APIExample/APIExample/zh-cn.ini | 23 +- windows/APIExample/install.ps1 | 2 +- 453 files changed, 42673 insertions(+), 7079 deletions(-) create mode 100755 .githooks/pre-commit create mode 100644 Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_ares.h create mode 100644 Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_defs.h create mode 100644 Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_poll.h create mode 100644 Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_ref.h create mode 100644 Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_types.h create mode 100644 Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/cpp/ahpl_ares_class.h create mode 100644 Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/cpp/ahpl_poll_class.h create mode 100644 Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/cpp/ahpl_ref_class.h create mode 100644 Android/APIExample/agora-stream-encrypt/.gitignore create mode 100644 Android/APIExample/agora-stream-encrypt/build.gradle create mode 100644 Android/APIExample/agora-stream-encrypt/consumer-rules.pro create mode 100644 Android/APIExample/agora-stream-encrypt/proguard-rules.pro create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/AndroidManifest.xml create mode 120000 Android/APIExample/agora-stream-encrypt/src/main/agoraLibs create mode 100644 
Android/APIExample/agora-stream-encrypt/src/main/cpp/CMakeLists.txt create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraBase.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraMediaBase.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraMediaPlayerTypes.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraOptional.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraRefPtr.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraH265Transcoder.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraLog.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaComponentFactory.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaEngine.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaPlayer.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaPlayerSource.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaRecorder.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaStreamingSource.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMusicContentCenter.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraParameter.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraRhythmPlayer.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraRtcEngine.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraRtcEngineEx.h create mode 
100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraSpatialAudio.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAudioDeviceManager.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/time_utils.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/include/packet_processing_plugin_jni.h create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/cpp/packet_processing_plugin_jni.cpp create mode 100644 Android/APIExample/agora-stream-encrypt/src/main/java/io/agora/api/streamencrypt/PacketProcessor.java delete mode 100644 Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/Drawable2dLandmarks.java delete mode 100644 Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/GLTestUtils.java delete mode 100644 Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/ProgramLandmarks.java delete mode 100644 Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/ProgramTexture2d.java delete mode 100644 Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/OffscreenSurface.java delete mode 100644 Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/WindowSurface.java delete mode 100644 Android/APIExample/app/src/main/java/io/agora/api/example/common/model/Peer.java create mode 100644 Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/FaceCapture.java create mode 100644 Android/APIExample/app/src/main/res/layout/fragment_face_capture.xml create mode 100644 Android/APIExample/checkstyle.gradle create mode 100644 Android/APIExample/checkstyle.xml create mode 100644 Android/APIExample/detekt-baseline.xml create mode 100644 Android/APIExample/detekt-config.yml create mode 100644 Android/APIExample/detekt.gradle create mode 100644 Android/APIExample/git-hooks.gradle create mode 100644 iOS/APIExample/.swiftlint.yml create 
mode 100644 iOS/APIExample/APIExample/Examples/Advanced/FaceCapture/Base.lproj/FaceCapture.storyboard
create mode 100644 iOS/APIExample/APIExample/Examples/Advanced/FaceCapture/FaceCapture.swift
create mode 100644 iOS/APIExample/APIExample/Examples/Advanced/FaceCapture/zh-Hans.lproj/FaceCapture.strings
create mode 100644 macOS/APIExample/Examples/Advanced/FaceCapture/Base.lproj/FaceCapture.storyboard
create mode 100644 macOS/APIExample/Examples/Advanced/FaceCapture/FaceCapture.swift
create mode 100644 macOS/APIExample/Examples/Advanced/FaceCapture/zh-Hans.lproj/FaceCapture.strings
create mode 100644 windows/APIExample/APIExample/Advanced/FaceCapture/CAgoraFaceCaptureDlg.cpp
create mode 100644 windows/APIExample/APIExample/Advanced/FaceCapture/CAgoraFaceCaptureDlg.h
create mode 100644 windows/APIExample/APIExample/Basic/JoinChannelVideoByToken/CJoinChannelVideoByTokenDlg.cpp
create mode 100644 windows/APIExample/APIExample/Basic/JoinChannelVideoByToken/CJoinChannelVideoByTokenDlg.h
diff --git a/.githooks/pre-commit b/.githooks/pre-commit
new file mode 100755
index 000000000..d662cda96
--- /dev/null
+++ b/.githooks/pre-commit
@@ -0,0 +1,38 @@
+#!/bin/sh
+#
+# An example hook script to verify what is about to be committed.
+# Called by "git commit" with no arguments. The hook should
+# exit with non-zero status after issuing an appropriate message if
+# it wants to stop the commit.
+#
+# To enable this hook, rename this file to "pre-commit".
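The commit-time check described above rests on one trick: filter the staged-file list down to Android paths and run the style checks only when that list is non-empty. A minimal standalone sketch of that filtering step, with a hard-coded file list (the file names `Foo.java` and `Bar.swift` are hypothetical, not from this patch):

```shell
#!/bin/sh
# Standalone sketch of the pre-commit filtering step.
# The real hook obtains the list from:
#   git diff --cached --name-only --diff-filter=ACM
# Here it is hard-coded with hypothetical file names.
staged="Android/APIExample/app/src/main/java/Foo.java
iOS/APIExample/Bar.swift"

# Keep only paths containing "Android"; "|| true" keeps the
# pipeline from failing the script when grep matches nothing.
android_files=$(printf '%s\n' "$staged" | grep 'Android' || true)

if [ -n "$android_files" ]; then
    echo "precommit >> would run checkstyle/detekt on: $android_files"
else
    echo "precommit >> no Android changes, skipping checks"
fi
```

The hook itself additionally changes into Android/APIExample and hands the filtered list to Gradle through the `commit_diff_files` project property, so only the touched files are checked.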
+
+if git rev-parse --verify HEAD >/dev/null 2>&1
+then
+    against=HEAD
+else
+    # Initial commit: diff against an empty tree object
+    against=$(git hash-object -t tree /dev/null)
+fi
+
+SCRIPT_DIR=$(dirname "$0")
+SCRIPT_ABS_PATH=$(cd "$SCRIPT_DIR"; pwd)
+
+
+ANDROID_DIFF_FILES=$(git diff --cached --name-only --diff-filter=ACM -- '*' | grep 'Android')
+if [ -n "$ANDROID_DIFF_FILES" ]
+then
+    cd Android/APIExample
+    echo "precommit >> current path = $(pwd), diff files = $ANDROID_DIFF_FILES"
+    ./gradlew -Dorg.gradle.project.commit_diff_files="$ANDROID_DIFF_FILES" checkstyle detekt
+    if [ $? -eq 0 ]; then
+        echo "precommit >> checkstyle detekt OK."
+    else
+        echo "precommit >> checkstyle detekt Failed."
+        exit 1
+    fi
+else
+    echo "precommit >> No changed Android files."
+fi
+
+
diff --git a/.github/workflows/gitee-sync.yml b/.github/workflows/gitee-sync.yml
index 5f85f0b99..02c9462fe 100644
--- a/.github/workflows/gitee-sync.yml
+++ b/.github/workflows/gitee-sync.yml
@@ -25,4 +25,4 @@ jobs:
      dst_token: ${{ secrets.GITEE_PRIVATE_TOKEN }}
      force_update: true
      account_type: org
-      shell_path: ./.github/workflows/gitee-sync-shell.sh
+      shell_path: ./.github/workflows/gitee-sync-shell.sh
\ No newline at end of file
diff --git a/Android/APIExample-Audio/app/build.gradle b/Android/APIExample-Audio/app/build.gradle
index ccccf79d4..c61ffa95d 100644
--- a/Android/APIExample-Audio/app/build.gradle
+++ b/Android/APIExample-Audio/app/build.gradle
@@ -48,7 +48,7 @@ dependencies {
        implementation fileTree(dir: "${localSdkPath}", include: ['*.jar', '*.aar'])
    } else{
-        def agora_sdk_version = "4.2.6"
+        def agora_sdk_version = "4.3.0"
        // case 1: full single lib with voice only
        implementation "io.agora.rtc:voice-sdk:${agora_sdk_version}"
        // case 2: partial libs with voice only
diff --git a/Android/APIExample-Audio/app/src/main/AndroidManifest.xml b/Android/APIExample-Audio/app/src/main/AndroidManifest.xml
index 939f096bc..e663d98a8 100644
--- 
a/Android/APIExample-Audio/app/src/main/AndroidManifest.xml +++ b/Android/APIExample-Audio/app/src/main/AndroidManifest.xml @@ -13,6 +13,9 @@ + + + + + \ No newline at end of file diff --git a/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/common/BaseFragment.java b/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/common/BaseFragment.java index 03fe5e7e5..3dde5fc35 100644 --- a/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/common/BaseFragment.java +++ b/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/common/BaseFragment.java @@ -4,33 +4,72 @@ import android.os.Bundle; import android.os.Handler; import android.os.Looper; -import android.view.Menu; +import android.text.TextUtils; +import android.view.View; import android.widget.Toast; +import androidx.activity.OnBackPressedCallback; import androidx.annotation.NonNull; import androidx.annotation.Nullable; import androidx.appcompat.app.AlertDialog; import androidx.fragment.app.Fragment; +import androidx.navigation.Navigation; -import io.agora.api.example.R; - -public class BaseFragment extends Fragment -{ +/** + * The type Base fragment. + */ +public class BaseFragment extends Fragment { + /** + * The Handler. 
+ */ protected Handler handler; private AlertDialog mAlertDialog; + private String mAlertMessage; + private final OnBackPressedCallback onBackPressedCallback = new OnBackPressedCallback(false) { + @Override + public void handleOnBackPressed() { + onBackPressed(); + } + }; @Override - public void onCreate(@Nullable Bundle savedInstanceState) - { + public void onCreate(@Nullable Bundle savedInstanceState) { super.onCreate(savedInstanceState); handler = new Handler(Looper.getMainLooper()); + requireActivity().getOnBackPressedDispatcher().addCallback(onBackPressedCallback); + } + + @Override + public void onAttach(@NonNull Context context) { + super.onAttach(context); + onBackPressedCallback.setEnabled(true); + } + + @Override + public void onDetach() { + super.onDetach(); + onBackPressedCallback.setEnabled(false); } + /** + * Show alert. + * + * @param message the message + */ protected void showAlert(String message) { + this.showAlert(message, true); + } + /** + * Show alert. + * + * @param message the message + * @param showRepeatMsg the show repeat msg + */ + protected void showAlert(String message, boolean showRepeatMsg) { runOnUIThread(() -> { Context context = getContext(); - if(context == null){ + if (context == null) { return; } if (mAlertDialog == null) { @@ -38,44 +77,74 @@ protected void showAlert(String message) { .setPositiveButton("OK", (dialog, which) -> dialog.dismiss()) .create(); } + if (!showRepeatMsg && !TextUtils.isEmpty(mAlertMessage) && mAlertMessage.equals(message)) { + return; + } + mAlertMessage = message; mAlertDialog.setMessage(message); mAlertDialog.show(); }); } - protected final void showLongToast(final String msg) - { + /** + * Reset alert. + */ + protected void resetAlert() { + runOnUIThread(() -> mAlertMessage = ""); + } + + /** + * Show long toast. 
+ * + * @param msg the msg + */ + protected final void showLongToast(final String msg) { runOnUIThread(() -> { Context context = getContext(); - if(context == null){ + if (context == null) { return; } Toast.makeText(context, msg, Toast.LENGTH_LONG).show(); }); } - protected final void showShortToast(final String msg) - { + /** + * Show short toast. + * + * @param msg the msg + */ + protected final void showShortToast(final String msg) { runOnUIThread(() -> { Context context = getContext(); - if(context == null){ + if (context == null) { return; } Toast.makeText(context, msg, Toast.LENGTH_SHORT).show(); }); } - protected final void runOnUIThread(Runnable runnable){ + /** + * Run on ui thread. + * + * @param runnable the runnable + */ + protected final void runOnUIThread(Runnable runnable) { this.runOnUIThread(runnable, 0); } - protected final void runOnUIThread(Runnable runnable, long delay){ - if(handler != null && runnable != null && getContext() != null){ + /** + * Run on ui thread. + * + * @param runnable the runnable + * @param delay the delay + */ + protected final void runOnUIThread(Runnable runnable, long delay) { + if (handler != null && runnable != null && getContext() != null) { if (delay <= 0 && handler.getLooper().getThread() == Thread.currentThread()) { runnable.run(); - }else{ + } else { handler.postDelayed(() -> { - if(getContext() != null){ + if (getContext() != null) { runnable.run(); } }, delay); @@ -87,15 +156,19 @@ protected final void runOnUIThread(Runnable runnable, long delay){ public void onDestroy() { super.onDestroy(); handler.removeCallbacksAndMessages(null); - if(mAlertDialog != null){ + if (mAlertDialog != null) { mAlertDialog.dismiss(); mAlertDialog = null; } } - @Override - public void onPrepareOptionsMenu(@NonNull Menu menu) { - super.onPrepareOptionsMenu(menu); - menu.setGroupVisible(R.id.main_setting_group, false); + /** + * On back pressed. 
+ */ + protected void onBackPressed() { + View view = getView(); + if (view != null) { + Navigation.findNavController(view).navigateUp(); + } } } diff --git a/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/common/model/StatisticsInfo.java b/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/common/model/StatisticsInfo.java index e531ca345..0b41aa1eb 100644 --- a/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/common/model/StatisticsInfo.java +++ b/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/common/model/StatisticsInfo.java @@ -1,26 +1,24 @@ package io.agora.api.example.common.model; -import io.agora.rtc2.IRtcEngineEventHandler.LastmileProbeResult; -import io.agora.rtc2.IRtcEngineEventHandler.LocalAudioStats; +import io.agora.rtc2.IRtcEngineEventHandler; import io.agora.rtc2.IRtcEngineEventHandler.LocalVideoStats; import io.agora.rtc2.IRtcEngineEventHandler.RemoteAudioStats; import io.agora.rtc2.IRtcEngineEventHandler.RemoteVideoStats; -import io.agora.rtc2.IRtcEngineEventHandler.RtcStats; public class StatisticsInfo { private LocalVideoStats localVideoStats = new LocalVideoStats(); - private LocalAudioStats localAudioStats = new LocalAudioStats(); + private IRtcEngineEventHandler.LocalAudioStats localAudioStats = new IRtcEngineEventHandler.LocalAudioStats(); private RemoteVideoStats remoteVideoStats = new RemoteVideoStats(); private RemoteAudioStats remoteAudioStats = new RemoteAudioStats(); - private RtcStats rtcStats = new RtcStats(); + private IRtcEngineEventHandler.RtcStats rtcStats = new IRtcEngineEventHandler.RtcStats(); private int quality; - private LastmileProbeResult lastMileProbeResult; + private IRtcEngineEventHandler.LastmileProbeResult lastMileProbeResult; public void setLocalVideoStats(LocalVideoStats localVideoStats) { this.localVideoStats = localVideoStats; } - public void setLocalAudioStats(LocalAudioStats localAudioStats) { + public void 
setLocalAudioStats(IRtcEngineEventHandler.LocalAudioStats localAudioStats) { this.localAudioStats = localAudioStats; } @@ -32,7 +30,7 @@ public void setRemoteAudioStats(RemoteAudioStats remoteAudioStats) { this.remoteAudioStats = remoteAudioStats; } - public void setRtcStats(RtcStats rtcStats) { + public void setRtcStats(IRtcEngineEventHandler.RtcStats rtcStats) { this.rtcStats = rtcStats; } @@ -162,7 +160,7 @@ public String getLastMileResult() { return stringBuilder.toString(); } - public void setLastMileProbeResult(LastmileProbeResult lastmileProbeResult) { + public void setLastMileProbeResult(IRtcEngineEventHandler.LastmileProbeResult lastmileProbeResult) { this.lastMileProbeResult = lastmileProbeResult; } diff --git a/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/common/widget/AudioSeatManager.java b/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/common/widget/AudioSeatManager.java index 05bd94dea..0e1292787 100644 --- a/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/common/widget/AudioSeatManager.java +++ b/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/common/widget/AudioSeatManager.java @@ -2,17 +2,32 @@ import android.view.View; +import java.util.ArrayList; + +/** + * The type Audio seat manager. + */ public class AudioSeatManager { private final AudioOnlyLayout[] audioOnlyLayouts; - public AudioSeatManager(AudioOnlyLayout... seats){ + /** + * Instantiates a new Audio seat manager. + * + * @param seats the seats + */ + public AudioSeatManager(AudioOnlyLayout... seats) { audioOnlyLayouts = new AudioOnlyLayout[seats.length]; for (int i = 0; i < audioOnlyLayouts.length; i++) { audioOnlyLayouts[i] = seats[i]; } } + /** + * Up local seat. 
+ * + * @param uid the uid + */ public void upLocalSeat(int uid) { AudioOnlyLayout localSeat = audioOnlyLayouts[0]; localSeat.setTag(uid); @@ -20,7 +35,12 @@ public void upLocalSeat(int uid) { localSeat.updateUserInfo(uid + "", true); } - public void upRemoteSeat(int uid){ + /** + * Up remote seat. + * + * @param uid the uid + */ + public void upRemoteSeat(int uid) { AudioOnlyLayout idleSeat = null; for (AudioOnlyLayout audioOnlyLayout : audioOnlyLayouts) { if (audioOnlyLayout.getTag() == null) { @@ -28,37 +48,70 @@ public void upRemoteSeat(int uid){ break; } } - if(idleSeat != null){ + if (idleSeat != null) { idleSeat.setTag(uid); idleSeat.setVisibility(View.VISIBLE); idleSeat.updateUserInfo(uid + "", false); } } - public void downSeat(int uid){ + /** + * Get seat remote uid list array list. + * + * @return the array list + */ + public ArrayList getSeatRemoteUidList() { + ArrayList uidList = new ArrayList<>(); + for (int i = 1; i < audioOnlyLayouts.length; i++) { + AudioOnlyLayout audioOnlyLayout = audioOnlyLayouts[i]; + Object tag = audioOnlyLayout.getTag(); + if (tag instanceof Integer) { + uidList.add((Integer) tag); + } + } + return uidList; + } + + /** + * Down seat. + * + * @param uid the uid + */ + public void downSeat(int uid) { AudioOnlyLayout seat = null; for (AudioOnlyLayout audioOnlyLayout : audioOnlyLayouts) { Object tag = audioOnlyLayout.getTag(); - if (tag instanceof Integer && (Integer)tag == uid) { + if (tag instanceof Integer && (Integer) tag == uid) { seat = audioOnlyLayout; break; } } - if(seat != null){ + if (seat != null) { seat.setTag(null); seat.setVisibility(View.INVISIBLE); } } - public AudioOnlyLayout getLocalSeat(){ + /** + * Get local seat audio only layout. + * + * @return the audio only layout + */ + public AudioOnlyLayout getLocalSeat() { return audioOnlyLayouts[0]; } - public AudioOnlyLayout getRemoteSeat(int uid){ + /** + * Get remote seat audio only layout. 
+ * + * @param uid the uid + * @return the audio only layout + */ + public AudioOnlyLayout getRemoteSeat(int uid) { AudioOnlyLayout seat = null; for (AudioOnlyLayout audioOnlyLayout : audioOnlyLayouts) { Object tag = audioOnlyLayout.getTag(); - if (tag instanceof Integer && (Integer)tag == uid) { + if (tag instanceof Integer && (Integer) tag == uid) { seat = audioOnlyLayout; break; } @@ -66,7 +119,10 @@ public AudioOnlyLayout getRemoteSeat(int uid){ return seat; } - public void downAllSeats(){ + /** + * Down all seats. + */ + public void downAllSeats() { for (AudioOnlyLayout audioOnlyLayout : audioOnlyLayouts) { audioOnlyLayout.setTag(null); audioOnlyLayout.setVisibility(View.INVISIBLE); diff --git a/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/examples/advanced/SpatialSound.java b/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/examples/advanced/SpatialSound.java index b77129bb0..36e5fd7c3 100644 --- a/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/examples/advanced/SpatialSound.java +++ b/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/examples/advanced/SpatialSound.java @@ -23,6 +23,7 @@ import com.google.android.material.bottomsheet.BottomSheetDialog; +import java.util.Arrays; import java.util.HashMap; import java.util.Map; @@ -36,6 +37,8 @@ import io.agora.mediaplayer.Constants; import io.agora.mediaplayer.IMediaPlayer; import io.agora.mediaplayer.IMediaPlayerObserver; +import io.agora.mediaplayer.data.CacheStatistics; +import io.agora.mediaplayer.data.PlayerPlaybackStats; import io.agora.mediaplayer.data.PlayerUpdatedInfo; import io.agora.mediaplayer.data.SrcInfo; import io.agora.rtc2.ChannelMediaOptions; @@ -50,6 +53,9 @@ import io.agora.spatialaudio.RemoteVoicePositionInfo; import io.agora.spatialaudio.SpatialAudioZone; +/** + * The type Spatial sound. 
+ */ @Example( index = 22, group = ADVANCED, @@ -92,7 +98,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { return; } try { - /**Creates an RtcEngine instance. + /*Creates an RtcEngine instance. * @param context The context of Android Activity * @param appId The App ID issued to you by Agora. See * How to get the App ID @@ -105,7 +111,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { config.mEventHandler = iRtcEngineEventHandler; config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = (RtcEngineEx) RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -130,7 +136,6 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { localSpatialAudioConfig.mRtcEngine = engine; localSpatial.initialize(localSpatialAudioConfig); - //localSpatial.muteAllRemoteAudioStreams(true); localSpatial.setMaxAudioRecvCount(2); localSpatial.setAudioRecvRange(AXIS_MAX_DISTANCE); localSpatial.setDistanceUnit(1); @@ -176,6 +181,7 @@ protected void onPositionChanged() { float[] forward = new float[]{1.0F, 0.0F, 0.0F}; float[] right = new float[]{0.0F, 1.0F, 0.0F}; float[] up = new float[]{0.0F, 0.0F, 1.0F}; + Log.d(TAG, "updateSelfPosition >> pos=" + Arrays.toString(pos)); localSpatial.updateSelfPosition(pos, forward, right, up); } }); @@ -199,15 +205,9 @@ protected void onPositionChanged() { mediaPlayerLeftZone.rightLength = viewRelativeSizeInAxis[0]; mediaPlayerLeftZone.upLength = AXIS_MAX_DISTANCE; localSpatial.setZones(new SpatialAudioZone[]{mediaPlayerLeftZone}); - localSpatial.updatePlayerPositionInfo(mediaPlayerLeft.getMediaPlayerId(), getVoicePositionInfo(mediaPlayerLeftIv)); } else { zoneTv.setVisibility(View.INVISIBLE); - SpatialAudioZone worldZone = new SpatialAudioZone(); - worldZone.upLength = AXIS_MAX_DISTANCE * 2; - 
worldZone.forwardLength = AXIS_MAX_DISTANCE * 2; - worldZone.rightLength = AXIS_MAX_DISTANCE * 2; - localSpatial.setZones(new SpatialAudioZone[]{worldZone}); - localSpatial.updatePlayerPositionInfo(mediaPlayerLeft.getMediaPlayerId(), getVoicePositionInfo(mediaPlayerLeftIv)); + localSpatial.setZones(null); } }); } @@ -219,7 +219,7 @@ private void joinChannel() { engine.setClientRole(io.agora.rtc2.Constants.CLIENT_ROLE_BROADCASTER); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see @@ -229,7 +229,7 @@ private void joinChannel() { ChannelMediaOptions option = new ChannelMediaOptions(); option.autoSubscribeAudio = true; - /** Allows a user to join a channel. + /* Allows a user to join a channel. 
if you do not specify the uid, we will generate the uid for you*/ int res = engine.joinChannel(ret, channelId, 0, option); if (res != 0) { @@ -299,7 +299,7 @@ private void initMediaPlayers() { private void showMediaPlayerSettingDialog(IMediaPlayer mediaPlayer) { String key = "MediaPlayer_" + mediaPlayer.getMediaPlayerId(); BottomSheetDialog dialog = cacheDialogs.get(key); - if(dialog != null){ + if (dialog != null) { dialog.show(); return; } @@ -346,7 +346,7 @@ public void onStopTrackingTouch(SeekBar seekBar) { private void showRemoteUserSettingDialog(int uid) { String key = "RemoteUser_" + uid; BottomSheetDialog dialog = cacheDialogs.get(key); - if(dialog != null){ + if (dialog != null) { dialog.show(); return; } @@ -398,7 +398,7 @@ private IMediaPlayer createLoopMediaPlayer() { IMediaPlayer mediaPlayer = engine.createMediaPlayer(); mediaPlayer.registerPlayerObserver(new IMediaPlayerObserver() { @Override - public void onPlayerStateChanged(Constants.MediaPlayerState state, Constants.MediaPlayerError error) { + public void onPlayerStateChanged(Constants.MediaPlayerState state, Constants.MediaPlayerReason reason) { if (state.equals(PLAYER_STATE_OPEN_COMPLETED)) { mediaPlayer.setLoopCount(-1); mediaPlayer.play(); @@ -406,7 +406,7 @@ public void onPlayerStateChanged(Constants.MediaPlayerState state, Constants.Med } @Override - public void onPositionChanged(long position_ms) { + public void onPositionChanged(long positionMs, long timestampMs) { } @@ -445,6 +445,16 @@ public void onPlayerInfoUpdated(PlayerUpdatedInfo info) { } + @Override + public void onPlayerCacheStats(CacheStatistics stats) { + + } + + @Override + public void onPlayerPlaybackStats(PlayerPlaybackStats stats) { + + } + @Override public void onAudioVolumeIndication(int volume) { @@ -465,22 +475,22 @@ private float[] getVoicePosition(View view) { float transY = view.getTranslationY(); double posForward = -1 * AXIS_MAX_DISTANCE * transY / ((rootView.getHeight()) / 2.0f); double posRight = 
AXIS_MAX_DISTANCE * transX / ((rootView.getWidth()) / 2.0f); - Log.d(TAG, "VoicePosition posForward=" + posForward + ", posRight=" + posRight); + //Log.d(TAG, "VoicePosition posForward=" + posForward + ", posRight=" + posRight); return new float[]{(float) posForward, (float) posRight, 0.0F}; } private float[] getViewRelativeSizeInAxis(View view) { return new float[]{ AXIS_MAX_DISTANCE * view.getWidth() * 1.0f / (rootView.getWidth() / 2.0f), - AXIS_MAX_DISTANCE * view.getHeight() * 1.0f / (rootView.getHeight() / 2.0f) , + AXIS_MAX_DISTANCE * view.getHeight() * 1.0f / (rootView.getHeight() / 2.0f), }; } private BottomSheetDialog showCommonSettingDialog(boolean isMute, SpatialAudioParams params, - CompoundButton.OnCheckedChangeListener muteCheckListener, - CompoundButton.OnCheckedChangeListener blurCheckListener, - CompoundButton.OnCheckedChangeListener airborneCheckListener, - SeekBar.OnSeekBarChangeListener attenuationSeekChangeListener + CompoundButton.OnCheckedChangeListener muteCheckListener, + CompoundButton.OnCheckedChangeListener blurCheckListener, + CompoundButton.OnCheckedChangeListener airborneCheckListener, + SeekBar.OnSeekBarChangeListener attenuationSeekChangeListener ) { BottomSheetDialog dialog = new BottomSheetDialog(requireContext()); View dialogView = LayoutInflater.from(requireContext()).inflate(R.layout.dialog_spatial_sound, null); @@ -571,12 +581,15 @@ public boolean onTouch(View v, MotionEvent event) { v.setTranslationY(newTranY); onPositionChanged(); break; - case MotionEvent.ACTION_UP: + default: break; } return true; } + /** + * On position changed. + */ protected abstract void onPositionChanged(); } @@ -584,7 +597,7 @@ public boolean onTouch(View v, MotionEvent event) { * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. 
*/ - private class InnerRtcEngineEventHandler extends IRtcEngineEventHandler { + private final class InnerRtcEngineEventHandler extends IRtcEngineEventHandler { @Override public void onJoinChannelSuccess(String channel, int uid, int elapsed) { super.onJoinChannelSuccess(channel, uid, elapsed); @@ -607,8 +620,8 @@ public void onJoinChannelSuccess(String channel, int uid, int elapsed) { /** * Error code description can be found at: - * en: https://api-ref.agora.io/en/voice-sdk/android/4.x/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror - * cn: https://docs.agora.io/cn/voice-call-4.x/API%20Reference/java_ng/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror + * en: https://api-ref.agora.io/en/video-sdk/android/4.x/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror + * cn: https://docs.agora.io/cn/video-call-4.x/API%20Reference/java_ng/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror */ @Override public void onError(int err) { @@ -633,7 +646,9 @@ public void onUserJoined(int uid, int elapsed) { remoteLeftTv.setTag(uid); remoteLeftTv.setVisibility(View.VISIBLE); remoteLeftTv.setText(uid + ""); - localSpatial.updateRemotePosition(uid, getVoicePositionInfo(remoteLeftTv)); + RemoteVoicePositionInfo info = getVoicePositionInfo(remoteLeftTv); + Log.d(TAG, "left remote user >> pos=" + Arrays.toString(info.position)); + localSpatial.updateRemotePosition(uid, info); remoteLeftTv.setOnClickListener(v -> showRemoteUserSettingDialog(uid)); } else if (remoteRightTv.getTag() == null) { diff --git a/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/CustomAudioRender.java b/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/CustomAudioRender.java index f88ea322d..479854f9a 100755 --- 
a/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/CustomAudioRender.java +++ b/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/CustomAudioRender.java @@ -6,7 +6,6 @@ import android.media.AudioFormat; import android.media.AudioManager; import android.os.Bundle; -import android.os.Handler; import android.os.Process; import android.util.Log; import android.view.LayoutInflater; @@ -41,20 +40,16 @@ /** * This demo demonstrates how to make a one-to-one voice call */ -@Example( - index = 6, - group = ADVANCED, - name = R.string.item_customaudiorender, - actionId = R.id.action_mainFragment_to_CustomAudioRender, - tipsId = R.string.customaudiorender -) +@Example(index = 6, group = ADVANCED, name = R.string.item_customaudiorender, actionId = R.id.action_mainFragment_to_CustomAudioRender, tipsId = R.string.customaudiorender) public class CustomAudioRender extends BaseFragment implements View.OnClickListener { private static final String TAG = CustomAudioRender.class.getSimpleName(); private EditText et_channel; private Button join; private boolean joined = false; + /** + * The constant engine. 
+ */ public static RtcEngineEx engine; - private ChannelMediaOptions option = new ChannelMediaOptions(); private static final Integer SAMPLE_RATE = 44100; private static final Integer SAMPLE_NUM_OF_CHANNEL = 2; @@ -69,22 +64,6 @@ public class CustomAudioRender extends BaseFragment implements View.OnClickListe private AudioSeatManager audioSeatManager; - @Override - public void onCreate(@Nullable Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - handler = new Handler(); - initMediaOption(); - } - - private void initMediaOption() { - option.autoSubscribeAudio = true; - option.autoSubscribeVideo = true; - option.publishMicrophoneTrack = true; - option.publishCustomAudioTrack = false; - option.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER; - option.enableAudioRecordingOrPlayout = true; - } - @Nullable @Override @@ -109,8 +88,7 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat view.findViewById(R.id.audio_place_06), view.findViewById(R.id.audio_place_07), view.findViewById(R.id.audio_place_08), - view.findViewById(R.id.audio_place_09) - ); + view.findViewById(R.id.audio_place_09)); } @Override @@ -123,30 +101,30 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. 
A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = (RtcEngineEx) RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -165,9 +143,9 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { engine.setLocalAccessPoint(localAccessPointConfiguration); } - engine.setExternalAudioSource(true, SAMPLE_RATE, SAMPLE_NUM_OF_CHANNEL); - - audioPlayer = new AudioPlayer(AudioManager.STREAM_MUSIC, SAMPLE_RATE, SAMPLE_NUM_OF_CHANNEL, + audioPlayer = new AudioPlayer(AudioManager.STREAM_MUSIC, + SAMPLE_RATE, + SAMPLE_NUM_OF_CHANNEL, AudioFormat.ENCODING_PCM_16BIT); } catch (Exception e) { e.printStackTrace(); @@ -179,7 +157,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { public void onDestroy() { super.onDestroy(); pulling = false; - if(pullingTask != null){ + if (pullingTask != null) { try { pullingTask.join(); pullingTask = null; @@ -188,7 +166,7 @@ public void onDestroy() { } } audioPlayer.stopPlayer(); - /**leaveChannel and Destroy the RtcEngine instance*/ + /*leaveChannel and Destroy the RtcEngine instance*/ if (engine != null) { engine.leaveChannel(); } @@ -197,7 +175,6 @@ public void onDestroy() { } - @Override public void onClick(View v) { if (v.getId() == R.id.btn_join) { @@ -211,17 
+188,13 @@ public void onClick(View v) { return; } // Request permission - AndPermission.with(this).runtime().permission( - Permission.Group.STORAGE, - Permission.Group.MICROPHONE - ).onGranted(permissions -> - { + AndPermission.with(this).runtime().permission(Permission.Group.STORAGE, Permission.Group.MICROPHONE).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. @@ -242,7 +215,7 @@ public void onClick(View v) { pulling = false; join.setText(getString(R.string.join)); audioSeatManager.downAllSeats(); - if(pullingTask != null){ + if (pullingTask != null) { try { pullingTask.join(); pullingTask = null; @@ -259,35 +232,25 @@ public void onClick(View v) { * Users that input the same channel name join the same channel. */ private void joinChannel(String channelId) { - /**In the demo, the default is to enter as the anchor.*/ - engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); - /**Sets the external audio source. - * @param enabled Sets whether to enable/disable the external audio source: - * true: Enable the external audio source. - * false: (Default) Disable the external audio source. - * @param sampleRate Sets the sample rate (Hz) of the external audio source, which can be - * set as 8000, 16000, 32000, 44100, or 48000 Hz. - * @param channels Sets the number of channels of the external audio source: - * 1: Mono. - * 2: Stereo. - * @return - * 0: Success. - * < 0: Failure. 
- * PS: Ensure that you call this method before the joinChannel method.*/ - // engine.setExternalAudioSource(true, SAMPLE_RATE, SAMPLE_NUM_OF_CHANNEL, 2, false, true); - - - - /**Please configure accessToken in the string_config file. + + engine.setExternalAudioSink(true, SAMPLE_RATE, SAMPLE_NUM_OF_CHANNEL); + + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, ret -> { - /** Allows a user to join a channel. + ChannelMediaOptions option = new ChannelMediaOptions(); + option.channelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; + option.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER; + + + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ int res = engine.joinChannel(ret, channelId, 0, option); + if (res != 0) { // Usually happens with invalid parameters // Error code description can be found at: @@ -337,7 +300,7 @@ public void run() { join.setText(getString(R.string.leave)); pulling = true; audioPlayer.startPlayer(); - if(pullingTask == null){ + if (pullingTask == null) { pullingTask = new Thread(new PullingTask()); pullingTask.start(); } @@ -358,7 +321,13 @@ public void onUserOffline(int uid, int reason) { } }; + /** + * The type Pulling task. + */ class PullingTask implements Runnable { + /** + * The Number. 
+ */ long number = 0; @Override @@ -366,23 +335,18 @@ public void run() { Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_AUDIO); while (pulling) { - Log.i(TAG, "pushExternalAudioFrame times:" + number++); + Log.i(TAG, "pullPlaybackAudioFrame times:" + number++); - long before = System.currentTimeMillis(); ByteBuffer frame = ByteBuffer.allocateDirect(BUFFER_SIZE); engine.pullPlaybackAudioFrame(frame, BUFFER_SIZE); byte[] data = new byte[frame.remaining()]; frame.get(data, 0, data.length); - audioPlayer.play(data, 0, BUFFER_SIZE); - long now = System.currentTimeMillis(); - long consuming = now - before; - if(consuming < PULL_INTERVAL){ - try { - Thread.sleep(PULL_INTERVAL - consuming); - } catch (InterruptedException e) { - Log.e(TAG, "PullingTask Interrupted"); - } + // Simple audio filter for demonstration: add a small constant offset to every sample byte. + for (int i = 0; i < data.length; i++) { + data[i] = (byte) (data[i] + 5); } + + audioPlayer.play(data, 0, BUFFER_SIZE); } } } diff --git a/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelAudio.java b/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelAudio.java index 89ca1b59f..f42034c23 100755 --- a/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelAudio.java +++ b/Android/APIExample-Audio/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelAudio.java @@ -2,9 +2,22 @@ import static io.agora.api.example.common.model.Examples.BASIC; +import android.app.Notification; +import android.app.NotificationChannel; +import android.app.NotificationManager; +import android.app.PendingIntent; +import android.app.Service; import android.content.Context; +import android.content.Intent; +import android.content.pm.ApplicationInfo; +import android.content.pm.ServiceInfo; +import android.graphics.Bitmap; +import android.graphics.BitmapFactory; +import android.os.Build; import android.os.Bundle; import android.os.Handler; +import android.os.IBinder; +import android.provider.Settings; import android.util.Log; import
android.view.LayoutInflater; import android.view.View; @@ -18,13 +31,17 @@ import androidx.annotation.NonNull; import androidx.annotation.Nullable; +import androidx.appcompat.app.AlertDialog; +import androidx.core.app.NotificationManagerCompat; import com.yanzhenjie.permission.AndPermission; import com.yanzhenjie.permission.runtime.Permission; +import java.util.ArrayList; import java.util.LinkedHashMap; import java.util.Map; +import io.agora.api.example.MainActivity; import io.agora.api.example.MainApplication; import io.agora.api.example.R; import io.agora.api.example.annotation.Example; @@ -133,11 +150,11 @@ public void onNothingSelected(AdapterView parent) { audioRouteInput.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() { @Override public void onItemSelected(AdapterView parent, View view, int position, long id) { - if(!joined){ + if (!joined) { return; } - boolean isChatRoomMode = "CHATROOM".equals(audioScenarioInput.getSelectedItem()); - if (isChatRoomMode) { + boolean isCommunication = getString(R.string.channel_profile_communication).equals(channelProfileInput.getSelectedItem()); + if (isCommunication) { int route = Constants.AUDIO_ROUTE_EARPIECE; if (getString(R.string.audio_route_earpiece).equals(parent.getSelectedItem())) { route = Constants.AUDIO_ROUTE_EARPIECE; @@ -146,9 +163,7 @@ public void onItemSelected(AdapterView parent, View view, int position, long } else if (getString(R.string.audio_route_headset).equals(parent.getSelectedItem())) { route = Constants.AUDIO_ROUTE_HEADSET; } else if (getString(R.string.audio_route_headset_bluetooth).equals(parent.getSelectedItem())) { - route = Constants.AUDIO_ROUTE_HEADSETBLUETOOTH; - } else if (getString(R.string.audio_route_headset_typec).equals(parent.getSelectedItem())) { - route = Constants.AUDIO_ROUTE_USBDEVICE; + route = Constants.AUDIO_ROUTE_BLUETOOTH_DEVICE_HFP; } int ret = engine.setRouteInCommunicationMode(route); showShortToast("setRouteInCommunicationMode route=" + route + ", 
ret=" + ret); @@ -197,6 +212,26 @@ record = view.findViewById(R.id.recordingVol); view.findViewById(R.id.audio_place_05), view.findViewById(R.id.audio_place_06) ); + + if (savedInstanceState != null) { + joined = savedInstanceState.getBoolean("joined"); + if (joined) { + myUid = savedInstanceState.getInt("myUid"); + ArrayList seatRemoteUidList = savedInstanceState.getIntegerArrayList("seatRemoteUidList"); + mute.setEnabled(true); + join.setEnabled(true); + join.setText(getString(R.string.leave)); + record.setEnabled(true); + playout.setEnabled(true); + inear.setEnabled(inEarSwitch.isChecked()); + inEarSwitch.setEnabled(true); + audioSeatManager.upLocalSeat(myUid); + + for (Integer uid : seatRemoteUidList) { + audioSeatManager.upRemoteSeat(uid); + } + } + } } @Override @@ -204,27 +239,27 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { super.onActivityCreated(savedInstanceState); // Check if the context is valid Context context = getContext(); - if (context == null) { + if (context == null || engine != null) { return; } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. 
A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ @@ -232,7 +267,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.valueOf(audioScenarioInput.getSelectedItem().toString())); config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -254,17 +289,81 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { e.printStackTrace(); getActivity().onBackPressed(); } + enableNotifications(); + } + + private void enableNotifications() { + if (NotificationManagerCompat.from(requireContext()).areNotificationsEnabled()) { + Log.d(TAG, "Notifications are enabled."); + return; + } + Log.d(TAG, "Notifications are not enabled."); + new AlertDialog.Builder(requireContext()) + .setTitle("Tip") + .setMessage(R.string.notifications_enable_tip) + .setPositiveButton(R.string.setting, (dialog, which) -> { + Intent intent = new Intent(); + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) { + intent.setAction(Settings.ACTION_APP_NOTIFICATION_SETTINGS); + intent.putExtra(Settings.EXTRA_APP_PACKAGE, requireContext().getPackageName()); + } else { + intent.setAction(Settings.ACTION_APPLICATION_DETAILS_SETTINGS); + } + startActivity(intent); + dialog.dismiss(); + }) + .show(); } @Override - public void onDestroy() { - super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ +
public void onPause() { + super.onPause(); + startRecordingService(); + } + + private void startRecordingService() { + if (joined) { + Intent intent = new Intent(requireContext(), LocalRecordingService.class); + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) { + requireContext().startForegroundService(intent); + } else { + requireContext().startService(intent); + } + } + } + + @Override + public void onSaveInstanceState(@NonNull Bundle outState) { + super.onSaveInstanceState(outState); + // join state + outState.putBoolean("joined", joined); + outState.putInt("myUid", myUid); + outState.putIntegerArrayList("seatRemoteUidList", audioSeatManager.getSeatRemoteUidList()); + } + + @Override + public void onResume() { + super.onResume(); + stopRecordingService(); + } + + private void stopRecordingService() { + Intent intent = new Intent(requireContext(), LocalRecordingService.class); + requireContext().stopService(intent); + } + + @Override + protected void onBackPressed() { + joined = false; + stopRecordingService(); + /*leaveChannel and Destroy the RtcEngine instance*/ if (engine != null) { engine.leaveChannel(); } handler.post(RtcEngine::destroy); engine = null; + super.onBackPressed(); } @Override @@ -285,8 +384,7 @@ public void onClick(View v) { AndPermission.with(this).runtime().permission( Permission.Group.STORAGE, Permission.Group.MICROPHONE - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); audioProfileInput.setEnabled(false); @@ -294,7 +392,7 @@ public void onClick(View v) { }).start(); } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. 
This method call is * asynchronous, and the user has not exited the channel when the method call returns. @@ -328,7 +426,7 @@ public void onClick(View v) { } else if (v.getId() == R.id.microphone) { mute.setActivated(!mute.isActivated()); mute.setText(getString(mute.isActivated() ? R.string.openmicrophone : R.string.closemicrophone)); - /**Turn off / on the microphone, stop / start local audio collection and push streaming.*/ + /*Turn off / on the microphone, stop / start local audio collection and push streaming.*/ engine.muteLocalAudioStream(mute.isActivated()); } } @@ -338,7 +436,7 @@ public void onClick(View v) { * Users that input the same channel name join the same channel. */ private void joinChannel(String channelId) { - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); int channelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; @@ -364,14 +462,14 @@ private void joinChannel(String channelId) { option.autoSubscribeAudio = true; option.autoSubscribeVideo = true; - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, ret -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. 
if you do not specify the uid, we will generate the uid for you*/ int res = engine.joinChannel(ret, channelId, 0, option); if (res != 0) { @@ -426,18 +524,15 @@ public void onJoinChannelSuccess(String channel, int uid, int elapsed) { showLongToast(String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); myUid = uid; joined = true; - handler.post(new Runnable() { - @Override - public void run() { - mute.setEnabled(true); - join.setEnabled(true); - join.setText(getString(R.string.leave)); - record.setEnabled(true); - playout.setEnabled(true); - inear.setEnabled(inEarSwitch.isChecked()); - inEarSwitch.setEnabled(true); - audioSeatManager.upLocalSeat(uid); - } + runOnUIThread(() -> { + mute.setEnabled(true); + join.setEnabled(true); + join.setText(getString(R.string.leave)); + record.setEnabled(true); + playout.setEnabled(true); + inear.setEnabled(inEarSwitch.isChecked()); + inEarSwitch.setEnabled(true); + audioSeatManager.upLocalSeat(uid); }); } @@ -544,22 +639,22 @@ public void onAudioRouteChanged(int routing) { showShortToast("onAudioRouteChanged : " + routing); runOnUIThread(() -> { String selectedRouteStr = getString(R.string.audio_route_speakerphone); - if(routing == Constants.AUDIO_ROUTE_EARPIECE){ + if (routing == Constants.AUDIO_ROUTE_EARPIECE) { selectedRouteStr = getString(R.string.audio_route_earpiece); - }else if(routing == Constants.AUDIO_ROUTE_SPEAKERPHONE){ + } else if (routing == Constants.AUDIO_ROUTE_SPEAKERPHONE) { selectedRouteStr = getString(R.string.audio_route_speakerphone); - }else if(routing == Constants.AUDIO_ROUTE_HEADSET){ + } else if (routing == Constants.AUDIO_ROUTE_HEADSET) { selectedRouteStr = getString(R.string.audio_route_headset); - }else if(routing == Constants.AUDIO_ROUTE_HEADSETBLUETOOTH){ + } else if (routing == Constants.AUDIO_ROUTE_BLUETOOTH_DEVICE_HFP) { selectedRouteStr = getString(R.string.audio_route_headset_bluetooth); - }else if(routing == Constants.AUDIO_ROUTE_USBDEVICE){ + } else if (routing == 
Constants.AUDIO_ROUTE_USBDEVICE) { selectedRouteStr = getString(R.string.audio_route_headset_typec); } int selection = 0; for (int i = 0; i < audioRouteInput.getAdapter().getCount(); i++) { String routeStr = (String) audioRouteInput.getItemAtPosition(i); - if(routeStr.equals(selectedRouteStr)){ + if (routeStr.equals(selectedRouteStr)) { selection = i; break; } @@ -568,4 +663,86 @@ public void onAudioRouteChanged(int routing) { }); } }; + + + /** + * The service shows a microphone foreground notification, + * which keeps audio recording alive even if the system destroys the activity (for example, under memory pressure). + * Note: The "android.permission.FOREGROUND_SERVICE" permission is required, + * and android:foregroundServiceType must be set to microphone. + */ + public static class LocalRecordingService extends Service { + private static final int NOTIFICATION_ID = 1234567800; + private static final String CHANNEL_ID = "audio_channel_id"; + + + @Override + public void onCreate() { + super.onCreate(); + Notification notification = getDefaultNotification(); + + try { + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) { + this.startForeground(NOTIFICATION_ID, notification, ServiceInfo.FOREGROUND_SERVICE_TYPE_MICROPHONE); + } else { + this.startForeground(NOTIFICATION_ID, notification); + } + } catch (Exception ex) { + Log.e(TAG, "", ex); + } + } + + @Nullable + @Override + public IBinder onBind(Intent intent) { + return null; + } + + private Notification getDefaultNotification() { + ApplicationInfo appInfo = this.getApplicationContext().getApplicationInfo(); + String name = this.getApplicationContext().getPackageManager().getApplicationLabel(appInfo).toString(); + int icon = appInfo.icon; + + try { + Bitmap iconBitMap = BitmapFactory.decodeResource(this.getApplicationContext().getResources(), icon); + if (iconBitMap == null || iconBitMap.getByteCount() == 0) { + Log.w(TAG, "Couldn't load icon from icon of applicationInfo, use android default"); + icon =
R.mipmap.ic_launcher; + } + } catch (Exception ex) { + Log.w(TAG, "Couldn't load icon from icon of applicationInfo, use android default"); + icon = R.mipmap.ic_launcher; + } + + if (Build.VERSION.SDK_INT >= 26) { + NotificationChannel mChannel = new NotificationChannel(CHANNEL_ID, name, NotificationManager.IMPORTANCE_DEFAULT); + NotificationManager mNotificationManager = (NotificationManager) this.getSystemService(Context.NOTIFICATION_SERVICE); + mNotificationManager.createNotificationChannel(mChannel); + } + + PendingIntent activityPendingIntent; + Intent intent = new Intent(); + intent.setClass(this, MainActivity.class); + if (Build.VERSION.SDK_INT >= 23) { + activityPendingIntent = PendingIntent.getActivity(this, 0, intent, PendingIntent.FLAG_ONE_SHOT | PendingIntent.FLAG_IMMUTABLE); + } else { + activityPendingIntent = PendingIntent.getActivity(this, 0, intent, PendingIntent.FLAG_ONE_SHOT); + } + + Notification.Builder builder = new Notification.Builder(this) + .addAction(icon, "Back to app", activityPendingIntent) + .setContentText("Agora Recording ...") + .setOngoing(true) + .setPriority(Notification.PRIORITY_HIGH) + .setSmallIcon(icon) + .setTicker(name) + .setWhen(System.currentTimeMillis()); + if (Build.VERSION.SDK_INT >= 26) { + builder.setChannelId(CHANNEL_ID); + } + + return builder.build(); + } + + } } diff --git a/Android/APIExample-Audio/app/src/main/res/layout/fragment_spatial_sound.xml b/Android/APIExample-Audio/app/src/main/res/layout/fragment_spatial_sound.xml index a925cec4c..a016b69c9 100644 --- a/Android/APIExample-Audio/app/src/main/res/layout/fragment_spatial_sound.xml +++ b/Android/APIExample-Audio/app/src/main/res/layout/fragment_spatial_sound.xml @@ -66,8 +66,8 @@ 默认 扬声器 听筒 - 耳机(非TypeC) + 耳机 耳机(TypeC) 蓝牙耳机 + 请打开通知权限，防止后台录音中断 \ No newline at end of file diff --git a/Android/APIExample-Audio/app/src/main/res/values/arrays.xml index 7f5fb504f..d29902f2d 100644 ---
a/Android/APIExample-Audio/app/src/main/res/values/arrays.xml +++ b/Android/APIExample-Audio/app/src/main/res/values/arrays.xml @@ -140,7 +140,6 @@ @string/audio_route_speakerphone @string/audio_route_earpiece @string/audio_route_headset - @string/audio_route_headset_typec @string/audio_route_headset_bluetooth diff --git a/Android/APIExample-Audio/app/src/main/res/values/strings.xml b/Android/APIExample-Audio/app/src/main/res/values/strings.xml index 7d8b407a6..bc36eba2c 100644 --- a/Android/APIExample-Audio/app/src/main/res/values/strings.xml +++ b/Android/APIExample-Audio/app/src/main/res/values/strings.xml @@ -90,7 +90,8 @@ default speakerphone earpiece - headset(Not TypeC) + headset headset(TypeC) bluetooth headset + Please turn on notification permission to prevent background recording from being interrupted. diff --git a/Android/APIExample/README.md b/Android/APIExample/README.md index e82e40e70..56c2c20f2 100644 --- a/Android/APIExample/README.md +++ b/Android/APIExample/README.md @@ -89,9 +89,36 @@ Since version 4.0.0, Agora SDK provides an Extension Interface Framework. Develo In order to enable it, you could do as follows: 1. Download the [opencv](https://agora-adc-artifacts.s3.cn-north-1.amazonaws.com.cn/androidLibs/opencv4.zip) library, unzip it and copy it into Android/APIExample/agora-simple-filter/src/main/jniLibs -2. Download the [Agora SDK package](https://docs.agora.io/cn/video-call-4.x/downloads?platform=Android), unzip it and copy the C++ .so libraries (keeping the arch folders) to Android/APIExample/agora-simple-filter/src/main/agoraLibs +2. Download the [Agora SDK package](https://doc.shengwang.cn/doc/rtc/android/resources), unzip it and copy the C++ .so libraries (keeping the arch folders) to Android/APIExample/agora-simple-filter/src/main/agoraLibs + +```text +Android/APIExample/agora-simple-filter/src/main/agoraLibs +├── arm64-v8a +├── armeabi-v7a +├── x86 +└── x86_64 +``` + 3.
Modify simpleFilter to true in Android/APIExample/gradle.properties +### Stream Encrypt + +This project contains custom stream encrypt examples, which are disabled by default. +The configuration method is as follows: + +1. Download the [Agora SDK package](https://doc.shengwang.cn/doc/rtc/android/resources), unzip it and copy the C++ .so libraries (keeping the arch folders) to Android/APIExample/agora-stream-encrypt/src/main/agoraLibs + +```text +Android/APIExample/agora-stream-encrypt/src/main/agoraLibs +├── arm64-v8a +├── armeabi-v7a +├── x86 +└── x86_64 +``` + +2. Modify streamEncrypt to true in Android/APIExample/gradle.properties + + ## Contact Us - For potential issues, take a look at our [FAQ](https://docs.agora.io/en/faq) first diff --git a/Android/APIExample/README.zh.md b/Android/APIExample/README.zh.md index fa5fc68ca..876de706e 100644 --- a/Android/APIExample/README.zh.md +++ b/Android/APIExample/README.zh.md @@ -85,9 +85,34 @@ 从4.0.0 SDK开始，Agora SDK支持插件系统和开放的云市场帮助开发者发布自己的音视频插件，本项目包含了一个SimpleFilter示例，默认是禁用的状态，如果需要开启编译和使用需要完成以下步骤： 1. 下载 [opencv](https://agora-adc-artifacts.s3.cn-north-1.amazonaws.com.cn/androidLibs/opencv4.zip) 解压后复制到 Android/APIExample/agora-simple-filter/src/main/jniLibs -2. 手动下载[Agora SDK包](https://docs.agora.io/cn/video-call-4.x/downloads?platform=Android), 解压后将c++动态库（包括架构文件夹）copy到Android/APIExample/agora-simple-filter/src/main/agoraLibs +2. 手动下载[Agora SDK包](https://doc.shengwang.cn/doc/rtc/android/resources), 解压后将c++动态库（包括架构文件夹）copy到Android/APIExample/agora-simple-filter/src/main/agoraLibs + +```text +Android/APIExample/agora-simple-filter/src/main/agoraLibs +├── arm64-v8a +├── armeabi-v7a +├── x86 +└── x86_64 +``` + 3. 修改Android/APIExample/gradle.properties配置文件中simpleFilter值为true +### 自定义加密 + +本项目包含自定义加密示例，默认是不启用的。配置方法如下： + +1.
手动下载[Agora SDK包](https://doc.shengwang.cn/doc/rtc/android/resources), 解压后将c++动态库（包括架构文件夹）copy到Android/APIExample/agora-stream-encrypt/src/main/agoraLibs + +```text +Android/APIExample/agora-stream-encrypt/src/main/agoraLibs +├── arm64-v8a +├── armeabi-v7a +├── x86 +└── x86_64 +``` + +2. 修改Android/APIExample/gradle.properties配置文件中streamEncrypt值为true + ## 联系我们 - 如果你遇到了困难，可以先参阅 [常见问题](https://docs.agora.io/cn/faq) diff --git a/Android/APIExample/agora-simple-filter/build.gradle b/Android/APIExample/agora-simple-filter/build.gradle index c54d70f49..560e7feda 100644 --- a/Android/APIExample/agora-simple-filter/build.gradle +++ b/Android/APIExample/agora-simple-filter/build.gradle @@ -43,6 +43,6 @@ dependencies { api fileTree(dir: "libs", include: ["*.jar", "*.aar"]) implementation 'androidx.appcompat:appcompat:1.1.0' testImplementation 'junit:junit:4.12' - androidTestImplementation 'androidx.test.ext:junit:1.1.1' - androidTestImplementation 'androidx.test.espresso:espresso-core:3.2.0' + androidTestImplementation 'androidx.test.ext:junit:1.1.3' + androidTestImplementation 'androidx.test.espresso:espresso-core:3.4.0' } diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraBase.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraBase.h index 81580b034..a3b4647a6 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraBase.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraBase.h @@ -80,7 +80,7 @@ #endif #endif -#define INVALID_DISPLAY_ID 0xffff +#define INVALID_DISPLAY_ID (-2) namespace agora { namespace util { @@ -92,16 +92,16 @@ class AutoPtr { typedef T* pointer_type; public: - explicit AutoPtr(pointer_type p = NULL) : ptr_(p) {} + explicit AutoPtr(pointer_type p = OPTIONAL_NULLPTR) : ptr_(p) {} ~AutoPtr() { if (ptr_) { ptr_->release(); - ptr_ = NULL; + ptr_ = OPTIONAL_NULLPTR; } } - operator bool() const { return (ptr_ != NULL); } + operator
bool() const { return (ptr_ != OPTIONAL_NULLPTR); } value_type& operator*() const { return *get(); } @@ -115,7 +115,7 @@ class AutoPtr { return ret; } - void reset(pointer_type ptr = NULL) { + void reset(pointer_type ptr = OPTIONAL_NULLPTR) { if (ptr != ptr_ && ptr_) { ptr_->release(); } @@ -125,12 +125,12 @@ class AutoPtr { template bool queryInterface(C1* c, C2 iid) { - pointer_type p = NULL; + pointer_type p = OPTIONAL_NULLPTR; if (c && !c->queryInterface(iid, reinterpret_cast(&p))) { reset(p); } - return (p != NULL); + return (p != OPTIONAL_NULLPTR); } private: @@ -153,7 +153,7 @@ class CopyableAutoPtr : public AutoPtr { return *this; } pointer_type clone() const { - if (!this->get()) return NULL; + if (!this->get()) return OPTIONAL_NULLPTR; return this->get()->clone(); } }; @@ -197,7 +197,7 @@ class AOutputIterator { typedef const value_type& const_reference; typedef value_type* pointer; typedef const value_type* const_pointer; - explicit AOutputIterator(IIterator* it = NULL) : p(it) {} + explicit AOutputIterator(IIterator* it = OPTIONAL_NULLPTR) : p(it) {} ~AOutputIterator() { if (p) p->release(); } @@ -215,7 +215,7 @@ class AOutputIterator { bool operator!=(const AOutputIterator& rhs) const { return !this->operator==(rhs); } reference operator*() { return *reinterpret_cast(p->current()); } const_reference operator*() const { return *reinterpret_cast(p->const_current()); } - bool valid() const { return p && p->current() != NULL; } + bool valid() const { return p && p->current() != OPTIONAL_NULLPTR; } }; template @@ -234,16 +234,16 @@ class AList { typedef const AOutputIterator const_iterator; public: - AList() : container(NULL), owner(false) {} + AList() : container(OPTIONAL_NULLPTR), owner(false) {} AList(IContainer* c, bool take_ownership) : container(c), owner(take_ownership) {} ~AList() { reset(); } - void reset(IContainer* c = NULL, bool take_ownership = false) { + void reset(IContainer* c = OPTIONAL_NULLPTR, bool take_ownership = false) { if (owner && 
container) container->release(); container = c; owner = take_ownership; } - iterator begin() { return container ? iterator(container->begin()) : iterator(NULL); } - iterator end() { return iterator(NULL); } + iterator begin() { return container ? iterator(container->begin()) : iterator(OPTIONAL_NULLPTR); } + iterator end() { return iterator(OPTIONAL_NULLPTR); } size_type size() const { return container ? container->size() : 0; } bool empty() const { return size() == 0; } }; @@ -731,10 +731,6 @@ enum ERROR_CODE_TYPE { * 1501: Video Device Module: The camera is not authorized. */ ERR_VDM_CAMERA_NOT_AUTHORIZED = 1501, - /** - * 2007: Audio Device Module: An error occurs in starting the application loopback. - */ - ERR_ADM_APPLICATION_LOOPBACK = 2007, }; enum LICENSE_ERROR_TYPE { @@ -857,9 +853,9 @@ enum INTERFACE_ID_TYPE { AGORA_IID_CLOUD_SPATIAL_AUDIO = 10, AGORA_IID_LOCAL_SPATIAL_AUDIO = 11, AGORA_IID_STATE_SYNC = 13, - AGORA_IID_METACHAT_SERVICE = 14, + AGORA_IID_META_SERVICE = 14, AGORA_IID_MUSIC_CONTENT_CENTER = 15, - AGORA_IID_H265_TRANSCODER = 16, + AGORA_IID_H265_TRANSCODER = 16, }; /** @@ -1267,7 +1263,7 @@ struct SenderOptions { SenderOptions() : ccMode(CC_ENABLED), - codecType(VIDEO_CODEC_H264), + codecType(VIDEO_CODEC_H265), targetBitrate(6500) {} }; @@ -1542,12 +1538,23 @@ struct VideoSubscriptionOptions { VideoSubscriptionOptions() {} }; + +/** The maximum length of the user account. + */ +enum MAX_USER_ACCOUNT_LENGTH_TYPE +{ + /** The maximum length of the user account is 256 bytes. + */ + MAX_USER_ACCOUNT_LENGTH = 256 +}; + /** * The definition of the EncodedVideoFrameInfo struct, which contains the information of the external encoded video frame. 
*/ struct EncodedVideoFrameInfo { EncodedVideoFrameInfo() - : codecType(VIDEO_CODEC_H264), + : uid(0), + codecType(VIDEO_CODEC_H264), width(0), height(0), framesPerSecond(0), @@ -1556,11 +1563,11 @@ struct EncodedVideoFrameInfo { trackId(0), captureTimeMs(0), decodeTimeMs(0), - uid(0), streamType(VIDEO_STREAM_HIGH) {} EncodedVideoFrameInfo(const EncodedVideoFrameInfo& rhs) - : codecType(rhs.codecType), + : uid(rhs.uid), + codecType(rhs.codecType), width(rhs.width), height(rhs.height), framesPerSecond(rhs.framesPerSecond), @@ -1569,11 +1576,11 @@ struct EncodedVideoFrameInfo { trackId(rhs.trackId), captureTimeMs(rhs.captureTimeMs), decodeTimeMs(rhs.decodeTimeMs), - uid(rhs.uid), streamType(rhs.streamType) {} EncodedVideoFrameInfo& operator=(const EncodedVideoFrameInfo& rhs) { if (this == &rhs) return *this; + uid = rhs.uid; codecType = rhs.codecType; width = rhs.width; height = rhs.height; @@ -1583,12 +1590,16 @@ struct EncodedVideoFrameInfo { trackId = rhs.trackId; captureTimeMs = rhs.captureTimeMs; decodeTimeMs = rhs.decodeTimeMs; - uid = rhs.uid; streamType = rhs.streamType; return *this; } + + /** + * ID of the user that pushes the external encoded video frame. + */ + uid_t uid; /** - * The codec type of the local video stream. See #VIDEO_CODEC_TYPE. The default value is `VIDEO_CODEC_H264 (2)`. + * The codec type of the local video stream. See #VIDEO_CODEC_TYPE. The default value is `VIDEO_CODEC_H265 (3)`. */ VIDEO_CODEC_TYPE codecType; /** @@ -1626,16 +1637,13 @@ struct EncodedVideoFrameInfo { * The timestamp for decoding the video. */ int64_t decodeTimeMs; - /** - * ID of the user that pushes the the external encoded video frame.. - */ - uid_t uid; /** * The stream type of video frame. */ VIDEO_STREAM_TYPE streamType; }; + /** * Video compression preference.
*/ @@ -1750,6 +1758,8 @@ struct CodecCapInfo { int codecCapMask; /** The codec capability level, estimated based on the device hardware.*/ CodecCapLevels codecLevels; + + CodecCapInfo(): codecType(VIDEO_CODEC_NONE), codecCapMask(0) {} }; /** @@ -1863,7 +1873,7 @@ struct VideoEncoderConfiguration { AdvanceOptions advanceOptions; VideoEncoderConfiguration(const VideoDimensions& d, int f, int b, ORIENTATION_MODE m, VIDEO_MIRROR_MODE_TYPE mirror = VIDEO_MIRROR_MODE_DISABLED) - : codecType(VIDEO_CODEC_H264), + : codecType(VIDEO_CODEC_H265), dimensions(d), frameRate(f), bitrate(b), @@ -1873,7 +1883,7 @@ struct VideoEncoderConfiguration { mirrorMode(mirror), advanceOptions(PREFER_AUTO, PREFER_LOW_LATENCY) {} VideoEncoderConfiguration(int width, int height, int f, int b, ORIENTATION_MODE m, VIDEO_MIRROR_MODE_TYPE mirror = VIDEO_MIRROR_MODE_DISABLED) - : codecType(VIDEO_CODEC_H264), + : codecType(VIDEO_CODEC_H265), dimensions(width, height), frameRate(f), bitrate(b), @@ -1893,7 +1903,7 @@ struct VideoEncoderConfiguration { mirrorMode(config.mirrorMode), advanceOptions(config.advanceOptions) {} VideoEncoderConfiguration() - : codecType(VIDEO_CODEC_H264), + : codecType(VIDEO_CODEC_H265), dimensions(FRAME_WIDTH_960, FRAME_HEIGHT_540), frameRate(FRAME_RATE_FPS_15), bitrate(STANDARD_BITRATE), @@ -2583,6 +2593,29 @@ enum VIDEO_APPLICATION_SCENARIO_TYPE { APPLICATION_SCENARIO_MEETING = 1, }; +/** + * The video QoE preference type. + */ +enum VIDEO_QOE_PREFERENCE_TYPE { + /** + * 1: Default QoE type, balance the delay, picture quality and fluency. + */ + VIDEO_QOE_PREFERENCE_BALANCE = 1, + /** + * 2: lower the e2e delay. + */ + VIDEO_QOE_PREFERENCE_DELAY_FIRST = 2, + /** + * 3: picture quality. + */ + VIDEO_QOE_PREFERENCE_PICTURE_QUALITY_FIRST = 3, + /** + * 4: more fluency. + */ + VIDEO_QOE_PREFERENCE_FLUENCY_FIRST = 4, + +}; + /** * The brightness level of the video image captured by the local camera. 
*/ @@ -2627,50 +2660,50 @@ enum LOCAL_AUDIO_STREAM_STATE { /** * Local audio state error codes. */ -enum LOCAL_AUDIO_STREAM_ERROR { +enum LOCAL_AUDIO_STREAM_REASON { /** * 0: The local audio is normal. */ - LOCAL_AUDIO_STREAM_ERROR_OK = 0, + LOCAL_AUDIO_STREAM_REASON_OK = 0, /** * 1: No specified reason for the local audio failure. Remind your users to try to rejoin the channel. */ - LOCAL_AUDIO_STREAM_ERROR_FAILURE = 1, + LOCAL_AUDIO_STREAM_REASON_FAILURE = 1, /** * 2: No permission to use the local audio device. Remind your users to grant permission. */ - LOCAL_AUDIO_STREAM_ERROR_DEVICE_NO_PERMISSION = 2, + LOCAL_AUDIO_STREAM_REASON_DEVICE_NO_PERMISSION = 2, /** * 3: (Android and iOS only) The local audio capture device is used. Remind your users to check * whether another application occupies the microphone. Local audio capture automatically resume * after the microphone is idle for about five seconds. You can also try to rejoin the channel * after the microphone is idle. */ - LOCAL_AUDIO_STREAM_ERROR_DEVICE_BUSY = 3, + LOCAL_AUDIO_STREAM_REASON_DEVICE_BUSY = 3, /** * 4: The local audio capture failed. */ - LOCAL_AUDIO_STREAM_ERROR_RECORD_FAILURE = 4, + LOCAL_AUDIO_STREAM_REASON_RECORD_FAILURE = 4, /** * 5: The local audio encoding failed. */ - LOCAL_AUDIO_STREAM_ERROR_ENCODE_FAILURE = 5, + LOCAL_AUDIO_STREAM_REASON_ENCODE_FAILURE = 5, /** 6: The SDK cannot find the local audio recording device. */ - LOCAL_AUDIO_STREAM_ERROR_NO_RECORDING_DEVICE = 6, + LOCAL_AUDIO_STREAM_REASON_NO_RECORDING_DEVICE = 6, /** 7: The SDK cannot find the local audio playback device. */ - LOCAL_AUDIO_STREAM_ERROR_NO_PLAYOUT_DEVICE = 7, + LOCAL_AUDIO_STREAM_REASON_NO_PLAYOUT_DEVICE = 7, /** * 8: The local audio capturing is interrupted by the system call. */ - LOCAL_AUDIO_STREAM_ERROR_INTERRUPTED = 8, + LOCAL_AUDIO_STREAM_REASON_INTERRUPTED = 8, /** 9: An invalid audio capture device ID. 
*/ - LOCAL_AUDIO_STREAM_ERROR_RECORD_INVALID_ID = 9, + LOCAL_AUDIO_STREAM_REASON_RECORD_INVALID_ID = 9, /** 10: An invalid audio playback device ID. */ - LOCAL_AUDIO_STREAM_ERROR_PLAYOUT_INVALID_ID = 10, + LOCAL_AUDIO_STREAM_REASON_PLAYOUT_INVALID_ID = 10, }; /** Local video state types. @@ -2698,73 +2731,73 @@ enum LOCAL_VIDEO_STREAM_STATE { /** * Local video state error codes. */ -enum LOCAL_VIDEO_STREAM_ERROR { +enum LOCAL_VIDEO_STREAM_REASON { /** * 0: The local video is normal. */ - LOCAL_VIDEO_STREAM_ERROR_OK = 0, + LOCAL_VIDEO_STREAM_REASON_OK = 0, /** * 1: No specified reason for the local video failure. */ - LOCAL_VIDEO_STREAM_ERROR_FAILURE = 1, + LOCAL_VIDEO_STREAM_REASON_FAILURE = 1, /** * 2: No permission to use the local video capturing device. Remind the user to grant permission * and rejoin the channel. */ - LOCAL_VIDEO_STREAM_ERROR_DEVICE_NO_PERMISSION = 2, + LOCAL_VIDEO_STREAM_REASON_DEVICE_NO_PERMISSION = 2, /** * 3: The local video capturing device is in use. Remind the user to check whether another * application occupies the camera. */ - LOCAL_VIDEO_STREAM_ERROR_DEVICE_BUSY = 3, + LOCAL_VIDEO_STREAM_REASON_DEVICE_BUSY = 3, /** * 4: The local video capture fails. Remind the user to check whether the video capture device * is working properly or the camera is occupied by another application, and then to rejoin the * channel. */ - LOCAL_VIDEO_STREAM_ERROR_CAPTURE_FAILURE = 4, + LOCAL_VIDEO_STREAM_REASON_CAPTURE_FAILURE = 4, /** * 5: The local video encoder is not supported. */ - LOCAL_VIDEO_STREAM_ERROR_ENCODE_FAILURE = 5, + LOCAL_VIDEO_STREAM_REASON_CODEC_NOT_SUPPORT = 5, /** * 6: (iOS only) The app is in the background. Remind the user that video capture cannot be * performed normally when the app is in the background. 
*/ - LOCAL_VIDEO_STREAM_ERROR_CAPTURE_INBACKGROUND = 6, + LOCAL_VIDEO_STREAM_REASON_CAPTURE_INBACKGROUND = 6, /** * 7: (iOS only) The current application window is running in Slide Over, Split View, or Picture * in Picture mode, and another app is occupying the camera. Remind the user that the application * cannot capture video properly when the app is running in Slide Over, Split View, or Picture in * Picture mode and another app is occupying the camera. */ - LOCAL_VIDEO_STREAM_ERROR_CAPTURE_MULTIPLE_FOREGROUND_APPS = 7, + LOCAL_VIDEO_STREAM_REASON_CAPTURE_MULTIPLE_FOREGROUND_APPS = 7, /** * 8: Fails to find a local video capture device. Remind the user to check whether the camera is * connected to the device properly or the camera is working properly, and then to rejoin the * channel. */ - LOCAL_VIDEO_STREAM_ERROR_DEVICE_NOT_FOUND = 8, + LOCAL_VIDEO_STREAM_REASON_DEVICE_NOT_FOUND = 8, /** * 9: (macOS only) The video capture device currently in use is disconnected (such as being * unplugged). */ - LOCAL_VIDEO_STREAM_ERROR_DEVICE_DISCONNECTED = 9, + LOCAL_VIDEO_STREAM_REASON_DEVICE_DISCONNECTED = 9, /** * 10: (macOS and Windows only) The SDK cannot find the video device in the video device list. * Check whether the ID of the video device is valid. */ - LOCAL_VIDEO_STREAM_ERROR_DEVICE_INVALID_ID = 10, + LOCAL_VIDEO_STREAM_REASON_DEVICE_INVALID_ID = 10, /** * 101: The current video capture device is unavailable due to excessive system pressure. */ - LOCAL_VIDEO_STREAM_ERROR_DEVICE_SYSTEM_PRESSURE = 101, + LOCAL_VIDEO_STREAM_REASON_DEVICE_SYSTEM_PRESSURE = 101, /** * 11: (macOS only) The shared window is minimized when you call `startScreenCaptureByWindowId` * to share a window. The SDK cannot share a minimized window. You can cancel the minimization * of this window at the application layer, for example by maximizing this window. 
*/ - LOCAL_VIDEO_STREAM_ERROR_SCREEN_CAPTURE_WINDOW_MINIMIZED = 11, + LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_MINIMIZED = 11, /** * 12: (macOS and Windows only) The error code indicates that a window shared by the window ID * has been closed or a full-screen window shared by the window ID has exited full-screen mode. @@ -2779,31 +2812,38 @@ enum LOCAL_VIDEO_STREAM_ERROR { * then shares the window of the web video or document. After the user exits full-screen mode, * the SDK reports this error code. */ - LOCAL_VIDEO_STREAM_ERROR_SCREEN_CAPTURE_WINDOW_CLOSED = 12, + LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_CLOSED = 12, /** 13: The local screen capture window is occluded. */ - LOCAL_VIDEO_STREAM_ERROR_SCREEN_CAPTURE_WINDOW_OCCLUDED = 13, + LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_OCCLUDED = 13, /** 20: The local screen capture window is not supported. */ - LOCAL_VIDEO_STREAM_ERROR_SCREEN_CAPTURE_WINDOW_NOT_SUPPORTED = 20, + LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_NOT_SUPPORTED = 20, /** 21: The screen capture fails. */ - LOCAL_VIDEO_STREAM_ERROR_SCREEN_CAPTURE_FAILURE = 21, + LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_FAILURE = 21, /** 22: No permision to capture screen. */ - LOCAL_VIDEO_STREAM_ERROR_SCREEN_CAPTURE_NO_PERMISSION = 22, - /** - * 23: The screen capture paused. + LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_NO_PERMISSION = 22, + /** + * 24: (Windows Only) An unexpected error (possibly due to window block failure) occurs during the screen + * sharing process, resulting in performance degradation. However, the screen sharing process itself is + * functioning normally. + */ + LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_AUTO_FALLBACK = 24, + /** 25: (Windows only) The local screen capture window is currently hidden and not visible on the desktop. */ + LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_HIDDEN = 25, + /** 26: (Windows only) The local screen capture window is recovered from its hidden state. 
*/ + LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_RECOVER_FROM_HIDDEN = 26, + /** 27: (Windows only) The window is recovered from its minimized state. */ + LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_RECOVER_FROM_MINIMIZED = 27, + /** + * 28: The screen capture is paused. * * Common scenarios for reporting this error code: * - When the desktop switch to the secure desktop such as UAC dialog or the Winlogon desktop on * Windows platform, the SDK reports this error code. */ - LOCAL_VIDEO_STREAM_ERROR_SCREEN_CAPTURE_PAUSED = 23, - /** 24: The screen capture is resumed. */ - LOCAL_VIDEO_STREAM_ERROR_SCREEN_CAPTURE_RESUMED = 24, - /** 25: (Windows only) The local screen capture window is currently hidden and not visible on the desktop. */ - LOCAL_VIDEO_STREAM_ERROR_SCREEN_CAPTURE_WINDOW_HIDDEN = 25, - /** 26: (Windows only) The local screen capture window is recovered from its hidden state. */ - LOCAL_VIDEO_STREAM_ERROR_SCREEN_CAPTURE_WINDOW_RECOVER_FROM_HIDDEN = 26, - /** 27:(Windows only) The window is recovered from miniminzed */ - LOCAL_VIDEO_STREAM_ERROR_SCREEN_CAPTURE_WINDOW_RECOVER_FROM_MINIMIZED = 27, + LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_PAUSED = 28, + /** 29: The screen capture is resumed. */ + LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_RESUMED = 29, + }; /** @@ -2991,7 +3031,6 @@ enum REMOTE_USER_STATE { * The remote user has enabled the local video capturing. */ USER_STATE_ENABLE_LOCAL_VIDEO = (1 << 8), - }; /** @@ -3001,7 +3040,7 @@ enum REMOTE_USER_STATE { struct VideoTrackInfo { VideoTrackInfo() : isLocal(false), ownerUid(0), trackId(0), channelId(OPTIONAL_NULLPTR) - , streamType(VIDEO_STREAM_HIGH), codecType(VIDEO_CODEC_H264) + , streamType(VIDEO_STREAM_HIGH), codecType(VIDEO_CODEC_H265) , encodedFrameOnly(false), sourceType(VIDEO_SOURCE_CAMERA_PRIMARY) , observationPosition(agora::media::base::POSITION_POST_CAPTURER) {} /** @@ -3014,7 +3053,6 @@ struct VideoTrackInfo { * ID of the user who publishes the video track.
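The 4.3.0 rename from `LOCAL_*_STREAM_ERROR` to `LOCAL_*_STREAM_REASON` is source-breaking for apps that reference the old constant names. One common migration shim (not part of the SDK; a sketch assuming you cannot update every call site at once) is to alias the old names to the new values:

```cpp
#include <cassert>

// A few of the renamed values, mirroring the hunks above.
enum LOCAL_AUDIO_STREAM_REASON {
  LOCAL_AUDIO_STREAM_REASON_OK = 0,
  LOCAL_AUDIO_STREAM_REASON_FAILURE = 1,
  LOCAL_AUDIO_STREAM_REASON_DEVICE_NO_PERMISSION = 2,
};

// Hypothetical compatibility aliases so legacy code keeps compiling.
constexpr LOCAL_AUDIO_STREAM_REASON LOCAL_AUDIO_STREAM_ERROR_OK =
    LOCAL_AUDIO_STREAM_REASON_OK;
constexpr LOCAL_AUDIO_STREAM_REASON LOCAL_AUDIO_STREAM_ERROR_FAILURE =
    LOCAL_AUDIO_STREAM_REASON_FAILURE;
```

Note that an alias is not enough where the numeric values themselves moved: the video screen-capture pause/resume codes change from 23/24 to 28/29 in this hunk, so any code persisting or comparing raw integers needs a real migration.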
*/ uid_t ownerUid; - /** * ID of the video track. */ @@ -3145,7 +3183,7 @@ class IPacketObserver { */ unsigned int size; - Packet() : buffer(NULL), size(0) {} + Packet() : buffer(OPTIONAL_NULLPTR), size(0) {} }; /** * Occurs when the SDK is ready to send the audio packet. @@ -3278,6 +3316,18 @@ struct LocalAudioStats * The audio delay of the device, contains record and playout delay */ int audioDeviceDelay; + /** + * The playout delay of the device + */ + int audioPlayoutDelay; + /** + * The signal delay estimated from audio in-ear monitoring (ms). + */ + int earMonitorDelay; + /** + * The signal delay estimated during the AEC process from nearin and farin (ms). + */ + int aecEstimatedDelay; }; @@ -3316,74 +3366,74 @@ enum RTMP_STREAM_PUBLISH_STATE { /** * Error codes of the RTMP or RTMPS streaming. */ -enum RTMP_STREAM_PUBLISH_ERROR_TYPE { +enum RTMP_STREAM_PUBLISH_REASON { /** * 0: The RTMP or RTMPS streaming publishes successfully. */ - RTMP_STREAM_PUBLISH_ERROR_OK = 0, + RTMP_STREAM_PUBLISH_REASON_OK = 0, /** * 1: Invalid argument used. If, for example, you do not call the `setLiveTranscoding` method to configure the LiveTranscoding parameters before calling the addPublishStreamUrl method, * the SDK returns this error. Check whether you set the parameters in the `setLiveTranscoding` method properly. */ - RTMP_STREAM_PUBLISH_ERROR_INVALID_ARGUMENT = 1, + RTMP_STREAM_PUBLISH_REASON_INVALID_ARGUMENT = 1, /** * 2: The RTMP or RTMPS streaming is encrypted and cannot be published. */ - RTMP_STREAM_PUBLISH_ERROR_ENCRYPTED_STREAM_NOT_ALLOWED = 2, + RTMP_STREAM_PUBLISH_REASON_ENCRYPTED_STREAM_NOT_ALLOWED = 2, /** * 3: Timeout for the RTMP or RTMPS streaming. Call the `addPublishStreamUrl` method to publish the streaming again. */ - RTMP_STREAM_PUBLISH_ERROR_CONNECTION_TIMEOUT = 3, + RTMP_STREAM_PUBLISH_REASON_CONNECTION_TIMEOUT = 3, /** * 4: An error occurs in Agora's streaming server. Call the `addPublishStreamUrl` method to publish the streaming again. 
*/ - RTMP_STREAM_PUBLISH_ERROR_INTERNAL_SERVER_ERROR = 4, + RTMP_STREAM_PUBLISH_REASON_INTERNAL_SERVER_ERROR = 4, /** * 5: An error occurs in the CDN server. */ - RTMP_STREAM_PUBLISH_ERROR_RTMP_SERVER_ERROR = 5, + RTMP_STREAM_PUBLISH_REASON_RTMP_SERVER_ERROR = 5, /** * 6: The RTMP or RTMPS streaming publishes too frequently. */ - RTMP_STREAM_PUBLISH_ERROR_TOO_OFTEN = 6, + RTMP_STREAM_PUBLISH_REASON_TOO_OFTEN = 6, /** * 7: The host publishes more than 10 URLs. Delete the unnecessary URLs before adding new ones. */ - RTMP_STREAM_PUBLISH_ERROR_REACH_LIMIT = 7, + RTMP_STREAM_PUBLISH_REASON_REACH_LIMIT = 7, /** * 8: The host manipulates other hosts' URLs. Check your app logic. */ - RTMP_STREAM_PUBLISH_ERROR_NOT_AUTHORIZED = 8, + RTMP_STREAM_PUBLISH_REASON_NOT_AUTHORIZED = 8, /** * 9: Agora's server fails to find the RTMP or RTMPS streaming. */ - RTMP_STREAM_PUBLISH_ERROR_STREAM_NOT_FOUND = 9, + RTMP_STREAM_PUBLISH_REASON_STREAM_NOT_FOUND = 9, /** * 10: The format of the RTMP or RTMPS streaming URL is not supported. Check whether the URL format is correct. */ - RTMP_STREAM_PUBLISH_ERROR_FORMAT_NOT_SUPPORTED = 10, + RTMP_STREAM_PUBLISH_REASON_FORMAT_NOT_SUPPORTED = 10, /** * 11: The user role is not host, so the user cannot use the CDN live streaming function. Check your application code logic. */ - RTMP_STREAM_PUBLISH_ERROR_NOT_BROADCASTER = 11, // Note: match to ERR_PUBLISH_STREAM_NOT_BROADCASTER in AgoraBase.h + RTMP_STREAM_PUBLISH_REASON_NOT_BROADCASTER = 11, // Note: match to ERR_PUBLISH_STREAM_NOT_BROADCASTER in AgoraBase.h /** * 13: The `updateRtmpTranscoding` or `setLiveTranscoding` method is called to update the transcoding configuration in a scenario where there is streaming without transcoding. Check your application code logic. 
*/ - RTMP_STREAM_PUBLISH_ERROR_TRANSCODING_NO_MIX_STREAM = 13, // Note: match to ERR_PUBLISH_STREAM_TRANSCODING_NO_MIX_STREAM in AgoraBase.h + RTMP_STREAM_PUBLISH_REASON_TRANSCODING_NO_MIX_STREAM = 13, // Note: match to ERR_PUBLISH_STREAM_TRANSCODING_NO_MIX_STREAM in AgoraBase.h /** * 14: Errors occurred in the host's network. */ - RTMP_STREAM_PUBLISH_ERROR_NET_DOWN = 14, // Note: match to ERR_NET_DOWN in AgoraBase.h + RTMP_STREAM_PUBLISH_REASON_NET_DOWN = 14, // Note: match to ERR_NET_DOWN in AgoraBase.h /** * 15: Your App ID does not have permission to use the CDN live streaming function. */ - RTMP_STREAM_PUBLISH_ERROR_INVALID_APPID = 15, // Note: match to ERR_PUBLISH_STREAM_APPID_INVALID in AgoraBase.h + RTMP_STREAM_PUBLISH_REASON_INVALID_APPID = 15, // Note: match to ERR_PUBLISH_STREAM_APPID_INVALID in AgoraBase.h /** invalid privilege. */ - RTMP_STREAM_PUBLISH_ERROR_INVALID_PRIVILEGE = 16, + RTMP_STREAM_PUBLISH_REASON_INVALID_PRIVILEGE = 16, /** * 100: The streaming has been stopped normally. After you call `removePublishStreamUrl` to stop streaming, the SDK returns this value. */ - RTMP_STREAM_UNPUBLISH_ERROR_OK = 100, + RTMP_STREAM_UNPUBLISH_REASON_OK = 100, }; /** Events during the RTMP or RTMPS streaming. */ @@ -3445,7 +3495,7 @@ typedef struct RtcImage { */ double alpha; - RtcImage() : url(NULL), x(0), y(0), width(0), height(0), zOrder(0), alpha(1.0) {} + RtcImage() : url(OPTIONAL_NULLPTR), x(0), y(0), width(0), height(0), zOrder(0), alpha(1.0) {} } RtcImage; /** * The configuration for advanced features of the RTMP or RTMPS streaming with transcoding. @@ -3453,7 +3503,7 @@ typedef struct RtcImage { * If you want to enable the advanced features of streaming with transcoding, contact support@agora.io. 
*/ struct LiveStreamAdvancedFeature { - LiveStreamAdvancedFeature() : featureName(NULL), opened(false) {} + LiveStreamAdvancedFeature() : featureName(OPTIONAL_NULLPTR), opened(false) {} LiveStreamAdvancedFeature(const char* feat_name, bool open) : featureName(feat_name), opened(open) {} /** The advanced feature for high-quality video with a lower bitrate. */ // static const char* LBHQ = "lbhq"; @@ -3573,6 +3623,7 @@ struct TranscodingUser { * @note If the value is not `0`, a special player is required. */ int audioChannel; + TranscodingUser() : uid(0), x(0), @@ -3692,7 +3743,31 @@ struct LiveTranscoding { /** The number of enabled advanced features. The default value is 0. */ unsigned int advancedFeatureCount; - LiveTranscoding() : width(360), height(640), videoBitrate(400), videoFramerate(15), lowLatency(false), videoGop(30), videoCodecProfile(VIDEO_CODEC_PROFILE_HIGH), backgroundColor(0x000000), videoCodecType(VIDEO_CODEC_H264_FOR_STREAM), userCount(0), transcodingUsers(NULL), transcodingExtraInfo(NULL), metadata(NULL), watermark(NULL), watermarkCount(0), backgroundImage(NULL), backgroundImageCount(0), audioSampleRate(AUDIO_SAMPLE_RATE_48000), audioBitrate(48), audioChannels(1), audioCodecProfile(AUDIO_CODEC_PROFILE_LC_AAC), advancedFeatures(NULL), advancedFeatureCount(0) {} + + LiveTranscoding() + : width(360), + height(640), + videoBitrate(400), + videoFramerate(15), + lowLatency(false), + videoGop(30), + videoCodecProfile(VIDEO_CODEC_PROFILE_HIGH), + backgroundColor(0x000000), + videoCodecType(VIDEO_CODEC_H264_FOR_STREAM), + userCount(0), + transcodingUsers(OPTIONAL_NULLPTR), + transcodingExtraInfo(OPTIONAL_NULLPTR), + metadata(OPTIONAL_NULLPTR), + watermark(OPTIONAL_NULLPTR), + watermarkCount(0), + backgroundImage(OPTIONAL_NULLPTR), + backgroundImageCount(0), + audioSampleRate(AUDIO_SAMPLE_RATE_48000), + audioBitrate(48), + audioChannels(1), + audioCodecProfile(AUDIO_CODEC_PROFILE_LC_AAC), + advancedFeatures(OPTIONAL_NULLPTR), + advancedFeatureCount(0) {} }; 
/** @@ -3754,7 +3829,7 @@ struct TranscodingVideoStream { TranscodingVideoStream() : sourceType(VIDEO_SOURCE_CAMERA_PRIMARY), remoteUserUid(0), - imageUrl(NULL), + imageUrl(OPTIONAL_NULLPTR), x(0), y(0), width(0), @@ -3764,7 +3839,6 @@ struct TranscodingVideoStream { mirror(false) {} }; - /** * The configuration of the video mixing on the local client. */ @@ -3788,11 +3862,7 @@ struct LocalTranscoderConfiguration { */ bool syncWithPrimaryCamera; - LocalTranscoderConfiguration() - : streamCount(0), - videoInputStreams(NULL), - videoOutputConfiguration(), - syncWithPrimaryCamera(true) {} + LocalTranscoderConfiguration() : streamCount(0), videoInputStreams(OPTIONAL_NULLPTR), videoOutputConfiguration(), syncWithPrimaryCamera(true) {} }; enum VIDEO_TRANSCODER_ERROR { @@ -4014,10 +4084,18 @@ enum CONNECTION_CHANGED_REASON_TYPE * 21: The connection is failed due to license validation failure. */ CONNECTION_CHANGED_LICENSE_VALIDATION_FAILURE = 21, - /** + /* * 22: The connection is failed due to certification verify failure. */ CONNECTION_CHANGED_CERTIFICATION_VERYFY_FAILURE = 22, + /** + * 23: The connection is failed due to the lack of granting permission to the stream channel. + */ + CONNECTION_CHANGED_STREAM_CHANNEL_NOT_AVAILABLE = 23, + /** + * 24: The connection is failed due to join channel with an inconsistent appid. + */ + CONNECTION_CHANGED_INCONSISTENT_APPID = 24, }; /** @@ -4156,14 +4234,19 @@ enum VIDEO_VIEW_SETUP_MODE { * Attributes of video canvas object. */ struct VideoCanvas { - /** - * Video display window. - */ - view_t view; /** * The user id of local video. */ uid_t uid; + + /** + * The uid of video stream composing the video stream from transcoder which will be drawn on this video canvas. + */ + uid_t subviewUid; + /** + * Video display window. + */ + view_t view; /** * A RGBA value indicates background color of the render view. Defaults to 0x00000000. 
*/ @@ -4205,28 +4288,37 @@ struct VideoCanvas { * The default value is empty(that is, if it has zero width or height), which means no cropping. */ Rectangle cropArea; - /** - * Whether to apply alpha mask to the video frame if exsit: - * true: Apply alpha mask to video frame. - * false: (Default) Do not apply alpha mask to video frame. - */ + * Whether to apply alpha mask to the video frame if exsit: + * true: Apply alpha mask to video frame. + * false: (Default) Do not apply alpha mask to video frame. + */ bool enableAlphaMask; - + /** + * The video frame position in pipeline. See \ref VIDEO_MODULE_POSITION "VIDEO_MODULE_POSITION". + * The default value is POSITION_POST_CAPTURER. + */ + media::base::VIDEO_MODULE_POSITION position; + VideoCanvas() - : view(NULL), uid(0), backgroundColor(0x00000000), renderMode(media::base::RENDER_MODE_HIDDEN), mirrorMode(VIDEO_MIRROR_MODE_AUTO), + : uid(0), subviewUid(0), view(NULL), backgroundColor(0x00000000), renderMode(media::base::RENDER_MODE_HIDDEN), mirrorMode(VIDEO_MIRROR_MODE_AUTO), setupMode(VIDEO_VIEW_SETUP_REPLACE), sourceType(VIDEO_SOURCE_CAMERA_PRIMARY), mediaPlayerId(-ERR_NOT_READY), - cropArea(0, 0, 0, 0), enableAlphaMask(false) {} + cropArea(0, 0, 0, 0), enableAlphaMask(false), position(media::base::POSITION_POST_CAPTURER) {} + + VideoCanvas(view_t v, media::base::RENDER_MODE_TYPE m, VIDEO_MIRROR_MODE_TYPE mt) + : uid(0), subviewUid(0), view(v), backgroundColor(0x00000000), renderMode(m), mirrorMode(mt), setupMode(VIDEO_VIEW_SETUP_REPLACE), + sourceType(VIDEO_SOURCE_CAMERA_PRIMARY), mediaPlayerId(-ERR_NOT_READY), + cropArea(0, 0, 0, 0), enableAlphaMask(false), position(media::base::POSITION_POST_CAPTURER) {} VideoCanvas(view_t v, media::base::RENDER_MODE_TYPE m, VIDEO_MIRROR_MODE_TYPE mt, uid_t u) - : view(v), uid(u), backgroundColor(0x00000000), renderMode(m), mirrorMode(mt), setupMode(VIDEO_VIEW_SETUP_REPLACE), + : uid(u), subviewUid(0), view(v), backgroundColor(0x00000000), renderMode(m), mirrorMode(mt), 
setupMode(VIDEO_VIEW_SETUP_REPLACE), sourceType(VIDEO_SOURCE_CAMERA_PRIMARY), mediaPlayerId(-ERR_NOT_READY), - cropArea(0, 0, 0, 0), enableAlphaMask(false) {} + cropArea(0, 0, 0, 0), enableAlphaMask(false), position(media::base::POSITION_POST_CAPTURER) {} - VideoCanvas(view_t v, media::base::RENDER_MODE_TYPE m, VIDEO_MIRROR_MODE_TYPE mt, user_id_t) - : view(v), uid(0), backgroundColor(0x00000000), renderMode(m), mirrorMode(mt), setupMode(VIDEO_VIEW_SETUP_REPLACE), + VideoCanvas(view_t v, media::base::RENDER_MODE_TYPE m, VIDEO_MIRROR_MODE_TYPE mt, uid_t u, uid_t subu) + : uid(u), subviewUid(subu), view(v), backgroundColor(0x00000000), renderMode(m), mirrorMode(mt), setupMode(VIDEO_VIEW_SETUP_REPLACE), sourceType(VIDEO_SOURCE_CAMERA_PRIMARY), mediaPlayerId(-ERR_NOT_READY), - cropArea(0, 0, 0, 0), enableAlphaMask(false) {} + cropArea(0, 0, 0, 0), enableAlphaMask(false), position(media::base::POSITION_POST_CAPTURER) {} }; /** Image enhancement options. @@ -4438,7 +4530,7 @@ struct VirtualBackgroundSource { */ BACKGROUND_BLUR_DEGREE blur_degree; - VirtualBackgroundSource() : background_source_type(BACKGROUND_COLOR), color(0xffffff), source(NULL), blur_degree(BLUR_DEGREE_HIGH) {} + VirtualBackgroundSource() : background_source_type(BACKGROUND_COLOR), color(0xffffff), source(OPTIONAL_NULLPTR), blur_degree(BLUR_DEGREE_HIGH) {} }; struct SegmentationProperty { @@ -4836,22 +4928,22 @@ struct ScreenCaptureParameters { */ int excludeWindowCount; - /** The width (px) of the border. Defaults to 0, and the value range is [0,50]. - * - */ - int highLightWidth; - /** The color of the border in RGBA format. The default value is 0xFF8CBF26. - * - */ - unsigned int highLightColor; - /** Whether to place a border around the shared window or screen: - * - true: Place a border. - * - false: (Default) Do not place a border. - * - * @note When you share a part of a window or screen, the SDK places a border around the entire window or screen if you set `enableHighLight` as true. 
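The reworked `VideoCanvas` adds `subviewUid` (to select one sub-stream out of a transcoded mixed stream) and a `position` field defaulting to `POSITION_POST_CAPTURER`, and reorders members so `uid`/`subviewUid` precede `view`. A reduced standalone model of the new defaults — field subset and types are simplified, this is not the SDK struct:

```cpp
#include <cassert>
#include <cstdint>

// Simplified stand-in for media::base::VIDEO_MODULE_POSITION.
enum VIDEO_MODULE_POSITION {
  POSITION_POST_CAPTURER = 1 << 0,
  POSITION_PRE_RENDERER = 1 << 1,
};

// Reduced model of the 4.3.0 VideoCanvas: uid/subviewUid now come before view.
struct MiniVideoCanvas {
  uint32_t uid;              // owner of the stream
  uint32_t subviewUid;       // sub-stream within a transcoded mixed stream
  void* view;                // display window
  uint32_t backgroundColor;  // RGBA, defaults to fully transparent
  VIDEO_MODULE_POSITION position;

  MiniVideoCanvas()
      : uid(0), subviewUid(0), view(nullptr), backgroundColor(0x00000000),
        position(POSITION_POST_CAPTURER) {}
};

// Default-constructed instance used to check the documented defaults.
static MiniVideoCanvas g_defaultCanvas;
```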
- * - */ - bool enableHighLight; + /** The width (px) of the border. Defaults to 0, and the value range is [0,50]. + * + */ + int highLightWidth; + /** The color of the border in RGBA format. The default value is 0xFF8CBF26. + * + */ + unsigned int highLightColor; + /** Whether to place a border around the shared window or screen: + * - true: Place a border. + * - false: (Default) Do not place a border. + * + * @note When you share a part of a window or screen, the SDK places a border around the entire window or screen if you set `enableHighLight` as true. + * + */ + bool enableHighLight; ScreenCaptureParameters() : dimensions(1920, 1080), frameRate(5), bitrate(STANDARD_BITRATE), captureMouseCursor(true), windowFocus(false), excludeWindowList(OPTIONAL_NULLPTR), excludeWindowCount(0), highLightWidth(0), highLightColor(0), enableHighLight(false) {} @@ -4968,7 +5060,7 @@ struct AudioRecordingConfiguration { int recordingChannel; AudioRecordingConfiguration() - : filePath(NULL), + : filePath(OPTIONAL_NULLPTR), encode(false), sampleRate(32000), fileRecordingType(AUDIO_FILE_RECORDING_MIXED), @@ -5172,59 +5264,6 @@ enum CHANNEL_MEDIA_RELAY_ERROR { RELAY_ERROR_DEST_TOKEN_EXPIRED = 11, }; -/** - * The event code of channel media relay. - */ -enum CHANNEL_MEDIA_RELAY_EVENT { - /** 0: The user disconnects from the server due to poor network connections. - */ - RELAY_EVENT_NETWORK_DISCONNECTED = 0, - /** 1: The user is connected to the server. - */ - RELAY_EVENT_NETWORK_CONNECTED = 1, - /** 2: The user joins the source channel. - */ - RELAY_EVENT_PACKET_JOINED_SRC_CHANNEL = 2, - /** 3: The user joins the destination channel. - */ - RELAY_EVENT_PACKET_JOINED_DEST_CHANNEL = 3, - /** 4: The SDK starts relaying the media stream to the destination channel. - */ - RELAY_EVENT_PACKET_SENT_TO_DEST_CHANNEL = 4, - /** 5: The server receives the video stream from the source channel. 
- */ - RELAY_EVENT_PACKET_RECEIVED_VIDEO_FROM_SRC = 5, - /** 6: The server receives the audio stream from the source channel. - */ - RELAY_EVENT_PACKET_RECEIVED_AUDIO_FROM_SRC = 6, - /** 7: The destination channel is updated. - */ - RELAY_EVENT_PACKET_UPDATE_DEST_CHANNEL = 7, - /** 8: The destination channel update fails due to internal reasons. - */ - RELAY_EVENT_PACKET_UPDATE_DEST_CHANNEL_REFUSED = 8, - /** 9: The destination channel does not change, which means that the destination channel fails to be updated. - */ - RELAY_EVENT_PACKET_UPDATE_DEST_CHANNEL_NOT_CHANGE = 9, - /** 10: The destination channel name is NULL. - */ - RELAY_EVENT_PACKET_UPDATE_DEST_CHANNEL_IS_NULL = 10, - /** 11: The video profile is sent to the server. - */ - RELAY_EVENT_VIDEO_PROFILE_UPDATE = 11, - /** 12: The SDK successfully pauses relaying the media stream to destination channels. - */ - RELAY_EVENT_PAUSE_SEND_PACKET_TO_DEST_CHANNEL_SUCCESS = 12, - /** 13: The SDK fails to pause relaying the media stream to destination channels. - */ - RELAY_EVENT_PAUSE_SEND_PACKET_TO_DEST_CHANNEL_FAILED = 13, - /** 14: The SDK successfully resumes relaying the media stream to destination channels. - */ - RELAY_EVENT_RESUME_SEND_PACKET_TO_DEST_CHANNEL_SUCCESS = 14, - /** 15: The SDK fails to resume relaying the media stream to destination channels. - */ - RELAY_EVENT_RESUME_SEND_PACKET_TO_DEST_CHANNEL_FAILED = 15, -}; /** * The state code of the channel media relay. */ @@ -5247,17 +5286,20 @@ enum CHANNEL_MEDIA_RELAY_STATE { /** The definition of ChannelMediaInfo. */ struct ChannelMediaInfo { - /** The channel name. The default value is NULL, which means that the SDK - * applies the current channel name. + /** The user ID. */ + uid_t uid; + /** The channel name. The default value is NULL, which means that the SDK + * applies the current channel name. + */ const char* channelName; - /** The token that enables the user to join the channel. 
The default value - * is NULL, which means that the SDK applies the current token. - */ + /** The token that enables the user to join the channel. The default value + * is NULL, which means that the SDK applies the current token. + */ const char* token; - /** The user ID. - */ - uid_t uid; + + ChannelMediaInfo() : uid(0), channelName(NULL), token(NULL) {} + ChannelMediaInfo(const char* c, const char* t, uid_t u) : uid(u), channelName(c), token(t) {} }; /** The definition of ChannelMediaRelayConfiguration. @@ -5275,7 +5317,7 @@ struct ChannelMediaRelayConfiguration { * - If you have enabled the App Certificate, you must use the token generated with the `channelName` and `uid`, and * the `uid` must be set as 0. */ - ChannelMediaInfo *srcInfo; + ChannelMediaInfo* srcInfo; /** The information of the destination channel `ChannelMediaInfo`. It contains the following members: * - `channelName`: The name of the destination channel. * - `uid`: The unique ID to identify the relay stream in the destination channel. The value @@ -5290,18 +5332,14 @@ struct ChannelMediaRelayConfiguration { * If you have enabled the App Certificate, you must use the token generated with the `channelName` * and `uid`. */ - ChannelMediaInfo *destInfos; + ChannelMediaInfo* destInfos; /** The number of destination channels. The default value is 0, and the value range is from 0 to * 6. Ensure that the value of this parameter corresponds to the number of `ChannelMediaInfo` * structs you define in `destInfo`. */ int destCount; - ChannelMediaRelayConfiguration() - : srcInfo(NULL), - destInfos(NULL), - destCount(0) - {} + ChannelMediaRelayConfiguration() : srcInfo(OPTIONAL_NULLPTR), destInfos(OPTIONAL_NULLPTR), destCount(0) {} }; /** @@ -5320,15 +5358,12 @@ struct UplinkNetworkInfo { } }; -/** - * The collections of downlink network info. - */ struct DownlinkNetworkInfo { struct PeerDownlinkInfo { /** * The ID of the user who owns the remote video stream. 
*/ - const char* uid; + const char* userId; /** * The remote video stream type: #VIDEO_STREAM_TYPE. */ @@ -5343,28 +5378,41 @@ struct DownlinkNetworkInfo { int expected_bitrate_bps; PeerDownlinkInfo() - : uid(OPTIONAL_NULLPTR), + : userId(OPTIONAL_NULLPTR), stream_type(VIDEO_STREAM_HIGH), current_downscale_level(REMOTE_VIDEO_DOWNSCALE_LEVEL_NONE), expected_bitrate_bps(-1) {} + PeerDownlinkInfo(const PeerDownlinkInfo& rhs) + : stream_type(rhs.stream_type), + current_downscale_level(rhs.current_downscale_level), + expected_bitrate_bps(rhs.expected_bitrate_bps) { + if (rhs.userId != OPTIONAL_NULLPTR) { + const int len = std::strlen(rhs.userId); + char* buf = new char[len + 1]; + std::memcpy(buf, rhs.userId, len); + buf[len] = '\0'; + userId = buf; + } + } + PeerDownlinkInfo& operator=(const PeerDownlinkInfo& rhs) { if (this == &rhs) return *this; - uid = OPTIONAL_NULLPTR; + userId = OPTIONAL_NULLPTR; stream_type = rhs.stream_type; current_downscale_level = rhs.current_downscale_level; expected_bitrate_bps = rhs.expected_bitrate_bps; - if (rhs.uid != OPTIONAL_NULLPTR) { - char* temp = new char[strlen(rhs.uid) + 1]; - strcpy(temp, rhs.uid); - uid = temp; + if (rhs.userId != OPTIONAL_NULLPTR) { + const int len = std::strlen(rhs.userId); + char* buf = new char[len + 1]; + std::memcpy(buf, rhs.userId, len); + buf[len] = '\0'; + userId = buf; } return *this; } - ~PeerDownlinkInfo() { - if (uid) { delete [] uid; } - } + ~PeerDownlinkInfo() { delete[] userId; } }; /** @@ -5422,9 +5470,7 @@ struct DownlinkNetworkInfo { return *this; } - ~DownlinkNetworkInfo() { - if (peer_downlink_info) delete [] peer_downlink_info; - } + ~DownlinkNetworkInfo() { delete[] peer_downlink_info; } }; /** @@ -5487,7 +5533,7 @@ struct EncryptionConfig { EncryptionConfig() : encryptionMode(AES_128_GCM2), - encryptionKey(NULL) + encryptionKey(OPTIONAL_NULLPTR) { memset(encryptionKdfSalt, 0, sizeof(encryptionKdfSalt)); } @@ -5558,15 +5604,6 @@ enum PERMISSION_TYPE { SCREEN_CAPTURE = 2, }; -/** The 
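`PeerDownlinkInfo` now owns its `userId` buffer, so the hunk adds a deep-copying copy constructor and rewrites `operator=` and the destructor. Below is a standalone sketch of the same rule-of-three pattern; note that it also initializes the pointer before the conditional copy and frees the old buffer on assignment — two details worth double-checking in the hunk above, where the copy constructor leaves `userId` uninitialized when the source pointer is null, and the assignment overwrites the previous pointer without deleting it:

```cpp
#include <cassert>
#include <cstring>

struct OwnedId {
  const char* userId = nullptr;  // heap-owned copy, or nullptr

  OwnedId() = default;
  explicit OwnedId(const char* s) { assign(s); }
  OwnedId(const OwnedId& rhs) { assign(rhs.userId); }  // userId starts at nullptr
  OwnedId& operator=(const OwnedId& rhs) {
    if (this != &rhs) {
      delete[] userId;  // free the previous buffer first (no leak)
      userId = nullptr;
      assign(rhs.userId);
    }
    return *this;
  }
  ~OwnedId() { delete[] userId; }  // delete[] on nullptr is a no-op

 private:
  void assign(const char* s) {
    if (s == nullptr) return;  // userId stays nullptr, never garbage
    const size_t len = std::strlen(s);
    char* buf = new char[len + 1];
    std::memcpy(buf, s, len + 1);  // copies the terminator too
    userId = buf;
  }
};

// Tiny demo of the value semantics: the copy owns its own buffer.
static OwnedId g_src("uid-42");
static OwnedId g_copy(g_src);
```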
maximum length of the user account. - */ -enum MAX_USER_ACCOUNT_LENGTH_TYPE -{ - /** The maximum length of the user account is 256 bytes. - */ - MAX_USER_ACCOUNT_LENGTH = 256 -}; - /** * The subscribing state. */ @@ -5654,8 +5691,8 @@ struct UserInfo { * The user account. The maximum data length is `MAX_USER_ACCOUNT_LENGTH_TYPE`. */ char userAccount[MAX_USER_ACCOUNT_LENGTH]; - UserInfo() - : uid(0) { + + UserInfo() : uid(0) { userAccount[0] = '\0'; } }; @@ -5896,22 +5933,6 @@ enum CONFIG_FETCH_TYPE { }; -/** - * media recorder source stream information - */ -struct RecorderStreamInfo { - /** - * The channel ID of the video track. - */ - const char* channelId; - /** - * The user ID. - */ - uid_t uid; - RecorderStreamInfo() : channelId(NULL), uid(0) {} -}; - - /** The local proxy mode type. */ enum LOCAL_PROXY_MODE { /** 0: Connect local proxy with high priority, if not connected to local proxy, fallback to sdrtn. @@ -5975,7 +5996,21 @@ struct LocalAccessPointConfiguration { LocalAccessPointConfiguration() : ipList(NULL), ipListSize(0), domainList(NULL), domainListSize(0), verifyDomainName(NULL), mode(ConnectivityFirst) {} }; - +/** + * The information about recorded media streams. + */ +struct RecorderStreamInfo { + const char* channelId; + /** + * The user ID. + */ + uid_t uid; + /** + * The channel ID of the audio/video stream needs to be recorded. 
+ */ + RecorderStreamInfo() : channelId(NULL), uid(0) {} + RecorderStreamInfo(const char* channelId, uid_t uid) : channelId(channelId), uid(uid) {} +}; } // namespace rtc namespace base { @@ -5994,9 +6029,9 @@ class AParameter : public agora::util::AutoPtr { private: bool initialize(IEngineBase* engine) { - IAgoraParameter* p = NULL; + IAgoraParameter* p = OPTIONAL_NULLPTR; if (engine && !engine->queryInterface(rtc::AGORA_IID_PARAMETER_ENGINE, (void**)&p)) reset(p); - return p != NULL; + return p != OPTIONAL_NULLPTR; } }; @@ -6048,7 +6083,47 @@ struct SpatialAudioParams { */ Optional enable_doppler; }; +/** + * Layout info of video stream which compose a transcoder video stream. +*/ +struct VideoLayout +{ + /** + * Channel Id from which this video stream come from. + */ + const char* channelId; + /** + * User id of video stream. + */ + rtc::uid_t uid; + /** + * User account of video stream. + */ + user_id_t strUid; + /** + * x coordinate of video stream on a transcoded video stream canvas. + */ + uint32_t x; + /** + * y coordinate of video stream on a transcoded video stream canvas. + */ + uint32_t y; + /** + * width of video stream on a transcoded video stream canvas. + */ + uint32_t width; + /** + * height of video stream on a transcoded video stream canvas. + */ + uint32_t height; + /** + * video state of video stream on a transcoded video stream canvas. + * 0 for normal video , 1 for placeholder image showed , 2 for black image. 
+ */ + uint32_t videoState; + VideoLayout() : channelId(OPTIONAL_NULLPTR), uid(0), strUid(OPTIONAL_NULLPTR), x(0), y(0), width(0), height(0), videoState(0) {} +}; } // namespace agora /** diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraExtensionProviderEntry.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraExtensionProviderEntry.h index 66c43e71a..acae66ea4 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraExtensionProviderEntry.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraExtensionProviderEntry.h @@ -11,7 +11,7 @@ #include "NGIAgoraExtensionControl.h" AGORA_API agora::rtc::IExtensionControl* AGORA_CALL getAgoraExtensionControl(); AGORA_API void AGORA_CALL declareProviderVersion( - const char*, const agora::rtc::ExtensionVersion&); + const char*, const agora::rtc::ExtensionVersion&); typedef void(*agora_ext_entry_func_t)(void); AGORA_API void AGORA_CALL registerProviderEntry(const char*, agora_ext_entry_func_t); @@ -19,7 +19,7 @@ AGORA_API void AGORA_CALL registerProviderEntry(const char*, agora_ext_entry_fun static void register_##PROVIDER_NAME##_to_agora() { \ auto control = getAgoraExtensionControl(); \ agora::rtc::ExtensionVersion version = \ - agora::rtc::ExtensionInterfaceVersion::Version(); \ + agora::rtc::ExtensionInterfaceVersion::Version(); \ declareProviderVersion(#PROVIDER_NAME, version); \ if (#PROVIDER_NAME && control) { \ control->registerProvider(#PROVIDER_NAME, \ @@ -27,6 +27,18 @@ static void register_##PROVIDER_NAME##_to_agora() { } \ } \ +#define DECLARE_CREATE_AND_REGISTER_PROVIDER_PTR(PROVIDER_NAME, PROVIDER_INTERFACE_USED, PROVIDER_REF_PTR) \ +static void register_##PROVIDER_NAME##_to_agora() { \ + auto control = getAgoraExtensionControl(); \ + agora::rtc::ExtensionVersion version = \ + agora::rtc::ExtensionInterfaceVersion::Version(); \ + declareProviderVersion(#PROVIDER_NAME, version); \ + if (#PROVIDER_NAME && control) 
{ \ + control->registerProvider(#PROVIDER_NAME, PROVIDER_REF_PTR); \ + } \ +} \ + + #if defined (__GNUC__) #define REGISTER_AGORA_EXTENSION_PROVIDER(PROVIDER_NAME, PROVIDER_CLASS, PROVIDER_INTERFACE_USED, ...) \ DECLARE_CREATE_AND_REGISTER_PROVIDER(PROVIDER_NAME, PROVIDER_CLASS, PROVIDER_INTERFACE_USED, __VA_ARGS__); \ @@ -35,6 +47,14 @@ static void _##PROVIDER_NAME##_provider_entry() { registerProviderEntry(#PROVIDER_NAME, register_##PROVIDER_NAME##_to_agora); \ } \ +#define REGISTER_AGORA_EXTENSION_PROVIDER_PTR(PROVIDER_NAME, PROVIDER_INTERFACE_USED, PROVIDER_REF_PTR) \ +DECLARE_CREATE_AND_REGISTER_PROVIDER_PTR(PROVIDER_NAME, PROVIDER_INTERFACE_USED, PROVIDER_REF_PTR); \ +__attribute__((constructor, used)) \ +static void _##PROVIDER_NAME##_provider_entry() { \ + registerProviderEntry(#PROVIDER_NAME, register_##PROVIDER_NAME##_to_agora); \ +} \ + + #elif defined (_MSC_VER) #define REGISTER_AGORA_EXTENSION_PROVIDER(PROVIDER_NAME, PROVIDER_CLASS, PROVIDER_INTERFACE_USED, ...) \ DECLARE_CREATE_AND_REGISTER_PROVIDER(PROVIDER_NAME, PROVIDER_CLASS, PROVIDER_INTERFACE_USED, __VA_ARGS__); \ @@ -44,6 +64,14 @@ static int _##PROVIDER_NAME##_provider_entry() { } \ const int DUMMY_AGORA_REGEXT_##PROVIDE_NAME##_VAR = _##PROVIDER_NAME##_provider_entry(); \ +#define REGISTER_AGORA_EXTENSION_PROVIDER_PTR(PROVIDER_NAME, PROVIDER_INTERFACE_USED, PROVIDER_REF_PTR) \ +DECLARE_CREATE_AND_REGISTER_PROVIDER_PTR(PROVIDER_NAME, PROVIDER_INTERFACE_USED, PROVIDER_REF_PTR); \ +static int _##PROVIDER_NAME##_provider_entry() { \ + registerProviderEntry(#PROVIDER_NAME, register_##PROVIDER_NAME##_to_agora); \ + return 0; \ +} \ +const int DUMMY_AGORA_REGEXT_##PROVIDE_NAME##_VAR = _##PROVIDER_NAME##_provider_entry(); \ + #else #error Unsupported Compilation Toolchain! 
#endif diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraMediaBase.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraMediaBase.h index bd0a7b881..15dfd4b38 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraMediaBase.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraMediaBase.h @@ -1,4 +1,3 @@ -// // Agora Engine SDK // // Created by Sting Feng in 2017-11. @@ -35,10 +34,8 @@ static const unsigned int INVALID_TRACK_ID = 0xffffffff; static const unsigned int DEFAULT_CONNECTION_ID = 0; static const unsigned int DUMMY_CONNECTION_ID = (std::numeric_limits::max)(); - struct EncodedVideoFrameInfo; - /** * Video source types definition. **/ @@ -105,45 +102,49 @@ enum AudioRoute */ ROUTE_DEFAULT = -1, /** - * The headset. + * The Headset. */ ROUTE_HEADSET = 0, /** - * The earpiece. + * The Earpiece. */ ROUTE_EARPIECE = 1, /** - * The headset with no microphone. + * The Headset with no microphone. */ ROUTE_HEADSETNOMIC = 2, /** - * The speakerphone. + * The Speakerphone. */ ROUTE_SPEAKERPHONE = 3, /** - * The loudspeaker. + * The Loudspeaker. */ ROUTE_LOUDSPEAKER = 4, /** - * The Bluetooth headset. + * The Bluetooth Headset via HFP. */ ROUTE_HEADSETBLUETOOTH = 5, /** - * The USB + * The USB. */ ROUTE_USB = 6, /** - * The HDMI + * The HDMI. */ ROUTE_HDMI = 7, /** - * The DISPLAYPORT + * The DisplayPort. */ ROUTE_DISPLAYPORT = 8, /** - * The AIRPLAY + * The AirPlay. */ ROUTE_AIRPLAY = 9, + /** + * The Bluetooth Speaker via A2DP. + */ + ROUTE_BLUETOOTH_SPEAKER = 10, }; /** @@ -551,6 +552,19 @@ enum CAMERA_VIDEO_SOURCE_TYPE { VIDEO_SOURCE_UNSPECIFIED = 2, }; +/** + * The IVideoFrameMetaInfo class. + * This interface provides access to metadata information. 
+ */ +class IVideoFrameMetaInfo { + public: + enum META_INFO_KEY { + KEY_FACE_CAPTURE = 0, + }; + virtual ~IVideoFrameMetaInfo() {}; + virtual const char* getMetaInfoStr(META_INFO_KEY key) const = 0; +}; + /** * The definition of the ExternalVideoFrame struct. */ @@ -698,12 +712,12 @@ struct ExternalVideoFrame { uint8_t* alphaBuffer; /** - * [Windows Texture related parameter] The pointer of ID3D11Texture2D used by the video frame. + * [For Windows only] The pointer of ID3D11Texture2D used by the video frame. */ void *d3d11_texture_2d; /** - * [Windows Texture related parameter] The index of ID3D11Texture2D array used by the video frame. + * [For Windows only] The index of ID3D11Texture2D array used by the video frame. */ int texture_slice_index; }; @@ -731,7 +745,8 @@ struct VideoFrame { textureId(0), d3d11Texture2d(NULL), alphaBuffer(NULL), - pixelBuffer(NULL){ + pixelBuffer(NULL), + metaInfo(NULL){ memset(matrix, 0, sizeof(matrix)); } /** @@ -821,6 +836,10 @@ struct VideoFrame { *The type of CVPixelBufferRef, for iOS and macOS only. */ void* pixelBuffer; + /** + * The pointer to IVideoFrameMetaInfo, which is the interface to get metainfo contents from VideoFrame. + */ + IVideoFrameMetaInfo* metaInfo; }; /** @@ -865,6 +884,7 @@ enum VIDEO_MODULE_POSITION { POSITION_POST_CAPTURER = 1 << 0, POSITION_PRE_RENDERER = 1 << 1, POSITION_PRE_ENCODER = 1 << 2, + POSITION_POST_CAPTURER_ORIGIN = 1 << 3, }; } // namespace base @@ -942,10 +962,6 @@ class IAudioFrameObserverBase { * are used. */ int64_t renderTimeMs; - /** - * A reserved parameter. - */ - int avsync_type; /** * A reserved parameter. * @@ -953,7 +969,18 @@ class IAudioFrameObserverBase { * this will then filled into audio4 extension part, the remote side could use this pts in av * sync process with video frame. */ + int avsync_type; + /** + * The pts timestamp of this audio frame. 
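As an aside to the hunk above: the new `IVideoFrameMetaInfo` interface and the `metaInfo` field on `VideoFrame` could be consumed roughly as below. This is a hedged sketch, not SDK code — the interface is re-declared locally only to keep it self-contained, and `FakeMetaInfo`/`readFaceCapture` are illustrative names; the real declaration lives in AgoraMediaBase.h, and how the frame reaches an observer is omitted.

```cpp
#include <cassert>
#include <cstring>

// Minimal re-declaration of the interface introduced in the hunk above,
// only so this sketch is self-contained.
class IVideoFrameMetaInfo {
 public:
  enum META_INFO_KEY { KEY_FACE_CAPTURE = 0 };
  virtual ~IVideoFrameMetaInfo() {}
  virtual const char* getMetaInfoStr(META_INFO_KEY key) const = 0;
};

// Illustrative stand-in for the SDK-side object attached to VideoFrame::metaInfo.
class FakeMetaInfo : public IVideoFrameMetaInfo {
 public:
  const char* getMetaInfoStr(META_INFO_KEY key) const override {
    return key == KEY_FACE_CAPTURE ? "{\"faces\":[]}" : nullptr;
  }
};

// What an app-side frame observer might do: the metaInfo pointer is
// NULL-initialized in the VideoFrame constructor, so always check it first.
const char* readFaceCapture(const IVideoFrameMetaInfo* metaInfo) {
  if (!metaInfo) return nullptr;
  return metaInfo->getMetaInfoStr(IVideoFrameMetaInfo::KEY_FACE_CAPTURE);
}
```

The NULL check matters because frames produced by SDKs that predate this field leave `metaInfo` unset.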
+ * + * This timestamp indicates the original pts time of the frame, and is used to + sync with the video frame by its pts timestamp + */ int64_t presentationMs; + /** + * The number of the audio track. + */ + int audioTrackNumber; AudioFrame() : type(FRAME_TYPE_PCM16), samplesPerChannel(0), @@ -963,7 +990,8 @@ class IAudioFrameObserverBase { buffer(NULL), renderTimeMs(0), avsync_type(0), - presentationMs(0) {} + presentationMs(0), + audioTrackNumber(0) {} }; enum AUDIO_FRAME_POSITION { @@ -1180,9 +1208,9 @@ struct UserAudioSpectrumInfo { */ struct AudioSpectrumData spectrumData; - UserAudioSpectrumInfo () : uid(0), spectrumData() {} - UserAudioSpectrumInfo(agora::rtc::uid_t _uid, const float *data, int length) : - uid(_uid) { spectrumData.audioSpectrumData = data; spectrumData.dataLength = length; } + UserAudioSpectrumInfo() : uid(0) {} + + UserAudioSpectrumInfo(agora::rtc::uid_t uid, const float* data, int length) : uid(uid), spectrumData(data, length) {} }; /** @@ -1205,7 +1233,6 @@ class IAudioSpectrumObserver { * - false: Not processed. */ virtual bool onLocalAudioSpectrum(const AudioSpectrumData& data) = 0; - /** * Reports the audio spectrum of remote user. * @@ -1223,7 +1250,7 @@ class IAudioSpectrumObserver { * - true: Processed. * - false: Not processed. */ - virtual bool onRemoteAudioSpectrum(const UserAudioSpectrumInfo * spectrums, unsigned int spectrumNumber) = 0; + virtual bool onRemoteAudioSpectrum(const UserAudioSpectrumInfo* spectrums, unsigned int spectrumNumber) = 0; }; /** @@ -1505,7 +1532,7 @@ enum MediaRecorderStreamType { */ enum RecorderState { /** - * -1: An error occurs during the recording. See RecorderErrorCode for the reason. + * -1: An error occurs during the recording. See RecorderReasonCode for the reason. */ RECORDER_STATE_ERROR = -1, /** @@ -1522,27 +1549,27 @@ enum RecorderState { * * @since v3.5.2 */ -enum RecorderErrorCode { +enum RecorderReasonCode { /** * 0: No error occurs.
*/ - RECORDER_ERROR_NONE = 0, + RECORDER_REASON_NONE = 0, /** * 1: The SDK fails to write the recorded data to a file. */ - RECORDER_ERROR_WRITE_FAILED = 1, + RECORDER_REASON_WRITE_FAILED = 1, /** * 2: The SDK does not detect audio and video streams to be recorded, or audio and video streams are interrupted for more than five seconds during recording. */ - RECORDER_ERROR_NO_STREAM = 2, + RECORDER_REASON_NO_STREAM = 2, /** * 3: The recording duration exceeds the upper limit. */ - RECORDER_ERROR_OVER_MAX_DURATION = 3, + RECORDER_REASON_OVER_MAX_DURATION = 3, /** * 4: The recording configuration changes. */ - RECORDER_ERROR_CONFIG_CHANGED = 4, + RECORDER_REASON_CONFIG_CHANGED = 4, }; /** * Configurations for the local audio and video recording. @@ -1605,7 +1632,6 @@ struct RecorderInfo { RecorderInfo(const char* name, unsigned int dur, unsigned int size) : fileName(name), durationMs(dur), fileSize(size) {} }; - class IMediaRecorderObserver { public: /** @@ -1619,9 +1645,9 @@ class IMediaRecorderObserver { * @param channelId The channel name. * @param uid ID of the user. * @param state The current recording state. See \ref agora::media::RecorderState "RecorderState". - * @param error The reason for the state change. See \ref agora::media::RecorderErrorCode "RecorderErrorCode". + * @param reason The reason for the state change. See \ref agora::media::RecorderReasonCode "RecorderReasonCode". */ - virtual void onRecorderStateChanged(const char* channelId, rtc::uid_t uid, RecorderState state, RecorderErrorCode error) = 0; + virtual void onRecorderStateChanged(const char* channelId, rtc::uid_t uid, RecorderState state, RecorderReasonCode reason) = 0; /** * Occurs when the recording information is updated. 
* @@ -1637,7 +1663,9 @@ class IMediaRecorderObserver { * */ virtual void onRecorderInfoUpdated(const char* channelId, rtc::uid_t uid, const RecorderInfo& info) = 0; + virtual ~IMediaRecorderObserver() {} }; + } // namespace media } // namespace agora diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraMediaPlayerTypes.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraMediaPlayerTypes.h index d1bb17bb3..3beaba788 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraMediaPlayerTypes.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraMediaPlayerTypes.h @@ -105,61 +105,61 @@ enum MEDIA_PLAYER_STATE { * @brief Player error code * */ -enum MEDIA_PLAYER_ERROR { +enum MEDIA_PLAYER_REASON { /** No error. */ - PLAYER_ERROR_NONE = 0, + PLAYER_REASON_NONE = 0, /** The parameter is invalid. */ - PLAYER_ERROR_INVALID_ARGUMENTS = -1, + PLAYER_REASON_INVALID_ARGUMENTS = -1, /** Internel error. */ - PLAYER_ERROR_INTERNAL = -2, + PLAYER_REASON_INTERNAL = -2, /** No resource. */ - PLAYER_ERROR_NO_RESOURCE = -3, + PLAYER_REASON_NO_RESOURCE = -3, /** Invalid media source. */ - PLAYER_ERROR_INVALID_MEDIA_SOURCE = -4, + PLAYER_REASON_INVALID_MEDIA_SOURCE = -4, /** The type of the media stream is unknown. */ - PLAYER_ERROR_UNKNOWN_STREAM_TYPE = -5, + PLAYER_REASON_UNKNOWN_STREAM_TYPE = -5, /** The object is not initialized. */ - PLAYER_ERROR_OBJ_NOT_INITIALIZED = -6, + PLAYER_REASON_OBJ_NOT_INITIALIZED = -6, /** The codec is not supported. */ - PLAYER_ERROR_CODEC_NOT_SUPPORTED = -7, + PLAYER_REASON_CODEC_NOT_SUPPORTED = -7, /** Invalid renderer. */ - PLAYER_ERROR_VIDEO_RENDER_FAILED = -8, + PLAYER_REASON_VIDEO_RENDER_FAILED = -8, /** An error occurs in the internal state of the player. */ - PLAYER_ERROR_INVALID_STATE = -9, + PLAYER_REASON_INVALID_STATE = -9, /** The URL of the media file cannot be found. 
*/ - PLAYER_ERROR_URL_NOT_FOUND = -10, + PLAYER_REASON_URL_NOT_FOUND = -10, /** Invalid connection between the player and the Agora server. */ - PLAYER_ERROR_INVALID_CONNECTION_STATE = -11, + PLAYER_REASON_INVALID_CONNECTION_STATE = -11, /** The playback buffer is insufficient. */ - PLAYER_ERROR_SRC_BUFFER_UNDERFLOW = -12, + PLAYER_REASON_SRC_BUFFER_UNDERFLOW = -12, /** The audio mixing file playback is interrupted. */ - PLAYER_ERROR_INTERRUPTED = -13, + PLAYER_REASON_INTERRUPTED = -13, /** The SDK does not support this function. */ - PLAYER_ERROR_NOT_SUPPORTED = -14, + PLAYER_REASON_NOT_SUPPORTED = -14, /** The token has expired. */ - PLAYER_ERROR_TOKEN_EXPIRED = -15, + PLAYER_REASON_TOKEN_EXPIRED = -15, /** The ip has expired. */ - PLAYER_ERROR_IP_EXPIRED = -16, + PLAYER_REASON_IP_EXPIRED = -16, /** An unknown error occurs. */ - PLAYER_ERROR_UNKNOWN = -17, + PLAYER_REASON_UNKNOWN = -17, }; /** @@ -357,18 +357,60 @@ struct CacheStatistics { int64_t downloadSize; }; -struct PlayerUpdatedInfo { - /** playerId has value when user trigger interface of opening +/** + * @brief The real time statistics of the media stream being played. + * + */ +struct PlayerPlaybackStats { + /** Video fps. */ - Optional playerId; - - /** deviceId has value when user trigger interface of opening + int videoFps; + /** Video bitrate (Kbps). + */ + int videoBitrateInKbps; + /** Audio bitrate (Kbps). */ - Optional deviceId; + int audioBitrateInKbps; + /** Total bitrate (Kbps). + */ + int totalBitrateInKbps; +}; - /** cacheStatistics exist if you enable cache, triggered 1s at a time after openning url +/** + * @brief The updated information of media player. + * + */ +struct PlayerUpdatedInfo { + /** @technical preview */ - Optional cacheStatistics; + const char* internalPlayerUuid; + /** The device ID of the playback device. + */ + const char* deviceId; + /** Video height. + */ + int videoHeight; + /** Video width. + */ + int videoWidth; + /** Audio sample rate. 
+ */ + int audioSampleRate; + /** The audio channel number. + */ + int audioChannels; + /** The bit number of each audio sample. + */ + int audioBitsPerSample; + + PlayerUpdatedInfo() + : internalPlayerUuid(NULL), + deviceId(NULL), + videoHeight(0), + videoWidth(0), + audioSampleRate(0), + audioChannels(0), + audioBitsPerSample(0) {} }; /** @@ -436,6 +478,17 @@ struct MediaSource { * - false: (Default) Disable cache. */ bool enableCache; + /** + * Determines whether to enable multi-track audio stream decoding. + * You can then select audio tracks of the media file for playback or for publishing to the channel. + * + * @note + * If you use the selectMultiAudioTrack API, you must set enableMultiAudioTrack to true. + * + * - true: Enable multi-track audio decoding. + * - false: (Default) Disable multi-track audio decoding. + */ + bool enableMultiAudioTrack; /** * Determines whether the opened media resource is a stream through the Agora Broadcast Streaming Network(CDN). * - true: It is a stream through the Agora Broadcast Streaming Network. @@ -454,7 +507,7 @@ struct MediaSource { IMediaPlayerCustomDataProvider* provider; MediaSource() : url(NULL), uri(NULL), startPos(0), autoPlay(true), enableCache(false), - provider(NULL){ + enableMultiAudioTrack(false), provider(NULL){ } }; diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraRefCountedObject.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraRefCountedObject.h index a457ef7db..93c970ed3 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraRefCountedObject.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/AgoraRefCountedObject.h @@ -7,6 +7,14 @@ // of Agora.io. #pragma once +#ifndef __AGORA_REF_COUNTED_OBJECT_H__ +#define __AGORA_REF_COUNTED_OBJECT_H__ +#endif + +#if defined(__AGORA_REF_COUNTED_OBJECT_INTERNAL_H__) +#error AgoraRefCountedObject is deprecated now; its only purpose is API compatibility.
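The `MediaSource.enableMultiAudioTrack` flag added above must be set before opening the file if the app later calls `selectMultiAudioTrack`. A minimal sketch of that setup, assuming only what the hunk shows — the struct is re-declared locally (`MediaSourceSketch` and `makeMultiTrackSource` are illustrative names, not SDK API) so it runs standalone:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

// Local stand-in mirroring the fields and defaults shown in the
// MediaSource hunk above; the real struct is in AgoraMediaPlayerTypes.h.
struct MediaSourceSketch {
  const char* url;
  int64_t startPos;
  bool autoPlay;
  bool enableCache;
  bool enableMultiAudioTrack;
  MediaSourceSketch()
      : url(NULL), startPos(0), autoPlay(true),
        enableCache(false), enableMultiAudioTrack(false) {}
};

// Prepare a source for multi-track playback: the flag defaults to false,
// so it has to be enabled explicitly before openWithMediaSource().
MediaSourceSketch makeMultiTrackSource(const char* url) {
  MediaSourceSketch src;
  src.url = url;
  src.enableMultiAudioTrack = true;  // required for selectMultiAudioTrack
  return src;
}
```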
+#endif + #include "AgoraRefPtr.h" #include "AgoraAtomicOps.h" diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraFileUploader.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraFileUploader.h index 1ccd36718..f0611fe38 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraFileUploader.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraFileUploader.h @@ -9,6 +9,7 @@ #pragma once // NOLINT(build/header_guard) #include "AgoraRefPtr.h" +#include namespace agora { namespace rtc { @@ -37,8 +38,8 @@ struct ImagePayloadData { class IFileUploaderService : public RefCountInterface { public: virtual ~IFileUploaderService() {} - virtual int startImageUpload(const ImagePayloadData* imgData) = 0; - virtual int stopImageUpload() = 0; + virtual int startImageUpload(const ImagePayloadData* imgData, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; + virtual int stopImageUpload(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; }; } // namespace rtc } // namespace agora diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraLog.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraLog.h index f3952163a..2fae3aa13 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraLog.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraLog.h @@ -75,7 +75,7 @@ const uint32_t MAX_LOG_SIZE = 20 * 1024 * 1024; // 20MB const uint32_t MIN_LOG_SIZE = 128 * 1024; // 128KB /** The default log size in kb */ -const uint32_t DEFAULT_LOG_SIZE_IN_KB = 1024; +const uint32_t DEFAULT_LOG_SIZE_IN_KB = 2048; /** Definition of LogConfiguration */ @@ -83,7 +83,7 @@ struct LogConfig { /**The log file path, default is NULL for default log path */ const char* filePath; - /** The log file size, KB , set 1024KB to use default log size + /** The log file size, KB , set 2048KB to use default log size */ uint32_t fileSizeInKB; /** 
The log level, set LOG_LEVEL_INFO to use default log level diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraMediaPlayerSource.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraMediaPlayerSource.h index 8d1a95be0..00be02233 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraMediaPlayerSource.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraMediaPlayerSource.h @@ -118,7 +118,7 @@ class IMediaPlayerSource : public RefCountInterface { * @param [out] pos A reference to the current playback position (ms). * @return * - 0: Success. - * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual int getPlayPosition(int64_t& pos) = 0; @@ -127,7 +127,7 @@ class IMediaPlayerSource : public RefCountInterface { * @param [out] count The number of the media streams in the media source. * @return * - 0: Success. - * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual int getStreamCount(int64_t& count) = 0; @@ -137,7 +137,7 @@ class IMediaPlayerSource : public RefCountInterface { * @param [out] info The detailed information of the media stream. See \ref media::base::PlayerStreamInfo "PlayerStreamInfo" for details. * @return * - 0: Success. - * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual int getStreamInfo(int64_t index, media::base::PlayerStreamInfo* info) = 0; @@ -149,7 +149,7 @@ class IMediaPlayerSource : public RefCountInterface { * - -1: Play the media file in a loop indefinitely, until {@link stop} is called. * @return * - 0: Success. - * - < 0: Failure. 
See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual int setLoopCount(int64_t loopCount) = 0; @@ -158,7 +158,7 @@ class IMediaPlayerSource : public RefCountInterface { * @param speed The playback speed ref [50-400]. * @return * - 0: Success. - * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual int setPlaybackSpeed(int speed) = 0; @@ -167,17 +167,34 @@ class IMediaPlayerSource : public RefCountInterface { * @param index The index of the audio track in media file. * @return * - 0: Success. - * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual int selectAudioTrack(int64_t index) = 0; + /** + * Selects audio tracks of the media file for playback or for publishing to the channel. + * @param playoutTrackIndex The index of the audio track in the media file for local playback. + * @param publishTrackIndex The index of the audio track in the media file published to the remote. + * + * @note + * You can obtain the streamIndex of the audio track by calling getStreamInfo. + * If you want to use selectMultiAudioTrack, you need to open the media file with openWithMediaSource and set enableMultiAudioTrack to true. + * + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + * - -2: Invalid argument. The argument must be greater than or equal to zero. + * - -8: Invalid state. You must open the media file with openWithMediaSource and set enableMultiAudioTrack to true. + */ + virtual int selectMultiAudioTrack(int playoutTrackIndex, int publishTrackIndex) = 0; + /** * Changes the player option before playing a file. * @param key The key of the option paramemter.
* @param value The value of option parameter. * @return * - 0: Success. - * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual int setPlayerOption(const char* key, int64_t value) = 0; @@ -187,7 +204,7 @@ class IMediaPlayerSource : public RefCountInterface { * @param value The value of option parameter. * @return * - 0: Success. - * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual int setPlayerOption(const char* key, const char* value) = 0; @@ -196,7 +213,7 @@ class IMediaPlayerSource : public RefCountInterface { * @param filename The filename of the screenshot file. * @return * - 0: Success. - * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual int takeScreenshot(const char* filename) = 0; @@ -205,7 +222,7 @@ class IMediaPlayerSource : public RefCountInterface { * @param index The index of the internal subtitles. * @return * - 0: Success. - * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual int selectInternalSubtitle(int64_t index) = 0; @@ -214,7 +231,7 @@ class IMediaPlayerSource : public RefCountInterface { * @param url The URL of the subtitle file. * @return * - 0: Success. - * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual int setExternalSubtitle(const char* url) = 0; @@ -231,7 +248,7 @@ class IMediaPlayerSource : public RefCountInterface { * @param observer The pointer to the IMediaPlayerSourceObserver object. 
* @return * - 0: Success. - * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual int registerPlayerSourceObserver(IMediaPlayerSourceObserver* observer) = 0; @@ -240,7 +257,7 @@ class IMediaPlayerSource : public RefCountInterface { * @param observer The pointer to the IMediaPlayerSourceObserver object. * @return * - 0: Success. - * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual int unregisterPlayerSourceObserver(IMediaPlayerSourceObserver* observer) = 0; @@ -250,7 +267,7 @@ class IMediaPlayerSource : public RefCountInterface { * @param observer The pointer to the {@link media::IAudioPcmFrameSink observer} object. * @return * - 0: Success. - * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual int registerAudioFrameObserver(media::IAudioPcmFrameSink* observer) = 0; @@ -259,7 +276,7 @@ class IMediaPlayerSource : public RefCountInterface { * @param observer The pointer to the {@link media::IAudioPcmFrameSink observer} object. * @return * - 0: Success. - * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual int unregisterAudioFrameObserver(media::IAudioPcmFrameSink* observer) = 0; @@ -380,18 +397,19 @@ class IMediaPlayerSourceObserver { * * When the state of the playback changes, the SDK triggers this callback to report the new playback state and the reason or error for the change. * @param state The new playback state after change. See {@link media::base::MEDIA_PLAYER_STATE MEDIA_PLAYER_STATE}. - * @param ec The player's error code. 
See {@link media::base::MEDIA_PLAYER_ERROR MEDIA_PLAYER_ERROR}. + * @param reason The reason for the player state change. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. */ virtual void onPlayerSourceStateChanged(media::base::MEDIA_PLAYER_STATE state, - media::base::MEDIA_PLAYER_ERROR ec) = 0; + media::base::MEDIA_PLAYER_REASON reason) = 0; /** * @brief Reports current playback progress. * * The callback occurs once every one second during the playback and reports the current playback progress. - * @param position Current playback progress (milisecond). + * @param positionMs Current playback progress (millisecond). + * @param timestampMs Current NTP (Network Time Protocol) time (millisecond). */ - virtual void onPositionChanged(int64_t position_ms) = 0; + virtual void onPositionChanged(int64_t positionMs, int64_t timestampMs) = 0; /** * @brief Reports the playback event. @@ -455,6 +473,24 @@ class IMediaPlayerSourceObserver { * @param info Include information of media player. */ virtual void onPlayerInfoUpdated(const media::base::PlayerUpdatedInfo& info) = 0; + + /** + * @brief Triggered every 1 second, reports the statistics of the files being cached. + * + * @param stats Cached file statistics. + */ + virtual void onPlayerCacheStats(const media::base::CacheStatistics& stats) { + (void)stats; + } + + /** + * @brief Triggered every 1 second, reports the statistics of the media stream being played. + * + * @param stats The statistics of the media stream.
+ */ + virtual void onPlayerPlaybackStats(const media::base::PlayerPlaybackStats& stats) { + (void)stats; + } /** * @brief Triggered every 200 millisecond ,update player current volume range [0,255] diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraParameter.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraParameter.h index 70ea5939e..b88969e1d 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraParameter.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraParameter.h @@ -15,6 +15,7 @@ */ #pragma once // NOLINT(build/header_guard) +#include "AgoraRefPtr.h" // external key /** @@ -115,6 +116,7 @@ * set the video codec type, such as "H264", "JPEG" */ #define KEY_RTC_VIDEO_CODEC_TYPE "engine.video.codec_type" +#define KEY_RTC_VIDEO_MINOR_STREAM_CODEC_TYPE "engine.video.minor_stream_codec_type" #define KEY_RTC_VIDEO_CODEC_INDEX "che.video.videoCodecIndex" /** * only use average QP for quality scaling @@ -144,7 +146,7 @@ typedef CopyableAutoPtr AString; namespace base { -class IAgoraParameter { +class IAgoraParameter : public RefCountInterface { public: /** * release the resource @@ -300,6 +302,7 @@ class IAgoraParameter { virtual int convertPath(const char* filePath, agora::util::AString& value) = 0; + protected: virtual ~IAgoraParameter() {} }; diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraRtmService.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraRtmService.h index 70dbbf1c1..1ce7c5f91 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraRtmService.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraRtmService.h @@ -7,6 +7,7 @@ #pragma once #include +#include namespace agora { @@ -519,7 +520,7 @@ class IChannel { * Sets an event handler for IChannel. 
* @param eventHandler The pointer to the event handler of IChannel: IChannelEventHandler. */ - virtual void setEventHandler(IChannelEventHandler *eventHandler) = 0; + virtual int setEventHandler(IChannelEventHandler *eventHandler, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Joins the current channel. * @@ -529,7 +530,7 @@ class IChannel { * - 0: Success. * - < 0: Failure. */ - virtual int join() = 0; + virtual int join(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Leaves the current channel. * @@ -538,7 +539,7 @@ class IChannel { * - 0: Success. * - < 0: Failure. */ - virtual int leave() = 0; + virtual int leave(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sends a channel message. * @@ -549,7 +550,7 @@ class IChannel { * - 0: Success. * - < 0: Failure. */ - virtual int sendMessage(const IMessage *message) = 0; + virtual int sendMessage(const IMessage *message, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Updates the channel attributes. * @@ -560,7 +561,7 @@ class IChannel { * - 0: Success. * - < 0: Failure. */ - virtual int updateAttributes(IChannelAttributes *attributes, int64_t &requestId) = 0; + virtual int updateAttributes(IChannelAttributes *attributes, int64_t &requestId, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Removes the channel attributes. * @@ -571,7 +572,7 @@ class IChannel { * - 0: Success. * - < 0: Failure. */ - virtual int deleteAttributes(IChannelAttributes *attributes, int64_t &requestId) = 0; + virtual int deleteAttributes(IChannelAttributes *attributes, int64_t &requestId, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the current request ID. * @return @@ -674,14 +675,14 @@ class IRtmService { * - 0: Success. * - < 0: Failure. */ - virtual int login(const char *token, const char *userId) = 0; + virtual int login(const char *token, const char *userId, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Logs out of the RTM service. * @return * - 0: Success. * - < 0: Failure. 
*/ - virtual int logout() = 0; + virtual int logout(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sends a peer message to a specified remote user. * @@ -691,7 +692,7 @@ class IRtmService { * - 0: Success. * - < 0: Failure. */ - virtual int sendMessageToPeer(const char *peerId, const IMessage *message) = 0; + virtual int sendMessageToPeer(const char *peerId, const IMessage *message, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Creates an RTM channel. * diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraRtmpStreamingService.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraRtmpStreamingService.h index 59736d4ab..0f6a1eec8 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraRtmpStreamingService.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraRtmpStreamingService.h @@ -12,6 +12,7 @@ #include "AgoraRefPtr.h" #include "IAgoraService.h" #include "NGIAgoraRtcConnection.h" +#include namespace agora { namespace rtc { @@ -51,13 +52,13 @@ class IRtmpStreamingObserver { * * @param url The RTMP URL address. * @param state The RTMP streaming state: #RTMP_STREAM_PUBLISH_STATE. - * @param errCode The detailed error information for streaming: #RTMP_STREAM_PUBLISH_ERROR_TYPE. + * @param reason The detailed error information for streaming: #RTMP_STREAM_PUBLISH_REASON. */ virtual void onRtmpStreamingStateChanged(const char* url, RTMP_STREAM_PUBLISH_STATE state, - RTMP_STREAM_PUBLISH_ERROR_TYPE errCode) { + RTMP_STREAM_PUBLISH_REASON reason) { (void)url; (void)state; - (void)errCode; + (void)reason; } /** Reports events during the RTMP or RTMPS streaming. @@ -109,7 +110,7 @@ class IRtmpStreamingService : public RefCountInterface { * - #ERR_NOT_INITIALIZED (7): You have not initialized the RTC engine when publishing the stream. * - #ERR_ALREADY_IN_USE (19): This streaming URL is already in use. Use a new streaming URL for CDN streaming. 
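The `IRtmpStreamingObserver` hunk above renames the third callback parameter from `errCode` (type `RTMP_STREAM_PUBLISH_ERROR_TYPE`) to `reason` (type `RTMP_STREAM_PUBLISH_REASON`). A hedged sketch of an observer handling the renamed parameter — the enums and their values here are illustrative stand-ins, not the SDK's definitions, and `StreamingLogger` is a hypothetical name:

```cpp
#include <cassert>
#include <string>

// Illustrative stand-ins for the SDK enums referenced in the hunk above.
enum RTMP_STREAM_PUBLISH_STATE { RTMP_STREAM_PUBLISH_STATE_FAILURE = 4 };
enum RTMP_STREAM_PUBLISH_REASON {
  RTMP_STREAM_PUBLISH_REASON_OK = 0,
  RTMP_STREAM_PUBLISH_REASON_INVALID_ARGUMENT = 1
};

class StreamingLogger {
 public:
  std::string last;
  // Matches the renamed signature: the third parameter is now `reason`.
  void onRtmpStreamingStateChanged(const char* url,
                                   RTMP_STREAM_PUBLISH_STATE state,
                                   RTMP_STREAM_PUBLISH_REASON reason) {
    // Record the values; a real observer would branch on state/reason.
    last = std::string(url) + " state=" + std::to_string(state) +
           " reason=" + std::to_string(reason);
  }
};
```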
*/ - virtual int startRtmpStreamWithoutTranscoding(const char* url) = 0; + virtual int startRtmpStreamWithoutTranscoding(const char* url, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** Publishes the local stream with transcoding to a specified CDN live RTMP address. (CDN live only.) @@ -132,7 +133,7 @@ class IRtmpStreamingService : public RefCountInterface { * - #ERR_NOT_INITIALIZED (7): You have not initialized the RTC engine when publishing the stream. * - #ERR_ALREADY_IN_USE (19): This streaming URL is already in use. Use a new streaming URL for CDN streaming. */ - virtual int startRtmpStreamWithTranscoding(const char* url, const LiveTranscoding& transcoding) = 0; + virtual int startRtmpStreamWithTranscoding(const char* url, const LiveTranscoding& transcoding, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** Update the video layout and audio settings for CDN live. (CDN live only.) * @note This method applies to Live Broadcast only. @@ -143,7 +144,7 @@ class IRtmpStreamingService : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int updateRtmpTranscoding(const LiveTranscoding& transcoding) = 0; + virtual int updateRtmpTranscoding(const LiveTranscoding& transcoding, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** Stop an RTMP stream with transcoding or without transcoding from the CDN. (CDN live only.) * This method removes the RTMP URL address (added by the \ref IRtcEngine::startRtmpStreamWithoutTranscoding "startRtmpStreamWithoutTranscoding" method @@ -162,7 +163,7 @@ class IRtmpStreamingService : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int stopRtmpStream(const char* url) = 0; + virtual int stopRtmpStream(const char* url, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Registers an RTMP streaming observer. @@ -171,7 +172,7 @@ class IRtmpStreamingService : public RefCountInterface { * - 0: Success. * - < 0: Failure. 
*/ - virtual int registerObserver(IRtmpStreamingObserver* observer) = 0; + virtual int registerObserver(IRtmpStreamingObserver* observer, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the RTMP streaming observer created by registerObserver(). * @param observer The pointer to the RTMP streaming observer that you want to release. See \ref agora::rtc::IRtmpStreamingObserver "IRtmpStreamingObserver". diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraService.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraService.h index 8890b01d0..9feb0c914 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraService.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/IAgoraService.h @@ -8,6 +8,7 @@ #include "IAgoraLog.h" #include "AgoraBase.h" #include "AgoraOptional.h" +#include namespace agora { class ILocalDataChannel; @@ -392,7 +393,7 @@ class IAgoraService { /** * Flush log & cache before exit */ - virtual void atExit() = 0; + virtual int atExit(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the \ref agora::base::IAgoraService "AgoraService" object. @@ -412,7 +413,7 @@ class IAgoraService { * - 0: Success. * - < 0: Failure. */ - virtual int setAudioSessionPreset(agora::rtc::AUDIO_SCENARIO_TYPE scenario) = 0; + virtual int setAudioSessionPreset(agora::rtc::AUDIO_SCENARIO_TYPE scenario, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Customizes the audio session configuration. @@ -422,7 +423,7 @@ class IAgoraService { * - 0: Success. * - < 0: Failure. */ - virtual int setAudioSessionConfiguration(const AudioSessionConfiguration& config) = 0; + virtual int setAudioSessionConfiguration(const AudioSessionConfiguration& config, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the audio session configuration. @@ -448,12 +449,12 @@ class IAgoraService { * \ref agora::base::IAgoraService::initialize "initialize". * * @param filePath The pointer to the log file. 
Ensure that the directory of the log file exists and is writable. - * @param fileSize The size of the SDK log file size (KB). + * @param fileSize The size of each SDK log file, in bytes. * @return * - 0: Success. * - < 0: Failure. */ - virtual int setLogFile(const char* filePath, unsigned int fileSize) = 0; + virtual int setLogFile(const char* filePath, unsigned int fileSize, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the SDK log output filter. * @@ -472,7 +473,7 @@ * - 0: Success. * - < 0: Failure. */ - virtual int setLogFilter(unsigned int filters) = 0; + virtual int setLogFilter(unsigned int filters, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Creates an \ref agora::rtc::IRtcConnection "RtcConnection" object and returns the pointer. * @@ -501,6 +502,17 @@ */ virtual agora_refptr createLocalAudioTrack() = 0; + /** + * Creates a local mixed audio track object and returns the pointer. + * + * By default, the audio track is created from a mixing source, which can mix other target tracks into it. + * + * @return + * - The pointer to \ref rtc::ILocalAudioTrack "ILocalAudioTrack": Success. + * - A null pointer: Failure. + */ + virtual agora_refptr createLocalMixedAudioTrack() = 0; + /** * Creates a local audio track object with a PCM data sender and returns the pointer. * @@ -662,19 +674,6 @@ virtual agora_refptr createCameraVideoTrack( agora_refptr videoSource, const char* id = OPTIONAL_NULLPTR) = 0; - /** - * Creates a local video track object with a screen capturer and returns the pointer. - * - * Once created, this track can be used to send video data for screen sharing. - * - * @param videoSource The pointer to the screen capturer: \ref agora::rtc::IScreenCapturer "IScreenCapturer". - * @return - * - The pointer to \ref rtc::ILocalVideoTrack "ILocalVideoTrack": Success. - * - A null pointer: Failure.
- */ - virtual agora_refptr createScreenVideoTrack( - agora_refptr videoSource, const char* id = OPTIONAL_NULLPTR) = 0; - /** * Creates a local video track object with a video mixer and returns the pointer. * @@ -867,9 +866,9 @@ class IAgoraService { */ virtual rtm::IRtmService* createRtmService() = 0; - virtual int addExtensionObserver(agora::agora_refptr observer) = 0; + virtual int addExtensionObserver(agora::agora_refptr observer, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; - virtual int removeExtensionObserver(agora::agora_refptr observer) = 0; + virtual int removeExtensionObserver(agora::agora_refptr observer, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Creates an audio device manager and returns the pointer. @@ -925,7 +924,7 @@ class IAgoraService { */ virtual int enableExtension( const char* provider_name, const char* extension_name, const char* track_id = NULL, - bool auto_enable_on_track = false) = 0; + bool auto_enable_on_track = false, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Disable extension. * @@ -938,7 +937,16 @@ class IAgoraService { * - < 0: Failure. */ virtual int disableExtension( - const char* provider_name, const char* extension_name, const char* track_id = NULL) = 0; + const char* provider_name, const char* extension_name, const char* track_id = NULL, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; + + /** + * Gets the IAgoraParameter object. + * @since 4.3.0 + * @return + * - The pointer to the \ref agora::base::IAgoraParameter "IAgoraParameter" object. + * - A null pointer: Failure. + */ + virtual agora_refptr getAgoraParameter() = 0; /** * Get the \ref agora::rtc::IConfigCenter "IConfigCenter" object and return the pointer. 
diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraAudioDeviceManager.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraAudioDeviceManager.h index c884a3951..6374ca11c 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraAudioDeviceManager.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraAudioDeviceManager.h @@ -10,6 +10,7 @@ #include "AgoraBase.h" #include "AgoraRefPtr.h" +#include namespace agora { namespace media { namespace base { @@ -146,7 +147,7 @@ class IRecordingDeviceSource : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int startRecording() = 0; + virtual int startRecording(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Stop the recording device. @@ -154,7 +155,7 @@ class IRecordingDeviceSource : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int stopRecording() = 0; + virtual int stopRecording(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Registers an audio frame observer. @@ -164,7 +165,7 @@ class IRecordingDeviceSource : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int registerAudioFrameObserver(media::IAudioPcmFrameSink* observer) = 0; + virtual int registerAudioFrameObserver(media::IAudioPcmFrameSink* observer, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the registered IAudioFrameObserver object. @@ -182,7 +183,7 @@ class IRecordingDeviceSource : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int setLoopbackDeviceParameter(const LoopbackRecordingOption &option) = 0; + virtual int setLoopbackDeviceParameter(const LoopbackRecordingOption &option, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; virtual ~IRecordingDeviceSource() {} }; @@ -212,7 +213,7 @@ class INGAudioDeviceManager : public RefCountInterface { * - 0: Success. * - < 0: Failure. 
*/ - virtual int setMicrophoneVolume(unsigned int volume) = 0; + virtual int setMicrophoneVolume(unsigned int volume, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the volume of the microphone. * @param volume The volume of the microphone. @@ -228,7 +229,7 @@ class INGAudioDeviceManager : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int setSpeakerVolume(unsigned int volume) = 0; + virtual int setSpeakerVolume(unsigned int volume, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the volume of the speaker. * @param volume The volume of the speaker. @@ -246,7 +247,7 @@ class INGAudioDeviceManager : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int setMicrophoneMute(bool mute) = 0; + virtual int setMicrophoneMute(bool mute, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the mute state of the microphone. * @param mute The mute state of the microphone. @@ -264,7 +265,7 @@ class INGAudioDeviceManager : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int setSpeakerMute(bool mute) = 0; + virtual int setSpeakerMute(bool mute, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the mute state of the speaker. * @param mute A reference to the mute state of the speaker. @@ -308,7 +309,7 @@ class INGAudioDeviceManager : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int setDefaultAudioRouting(AudioRoute route) = 0; + virtual int setDefaultAudioRouting(AudioRoute route, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Changes the current audio routing. * @@ -320,7 +321,7 @@ class INGAudioDeviceManager : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int changeAudioRouting(AudioRoute route) = 0; + virtual int changeAudioRouting(AudioRoute route, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Changes the speaker status on/off. * @@ -332,7 +333,7 @@ class INGAudioDeviceManager : public RefCountInterface { * - 0: Success. 
* - < 0: Failure. */ - virtual int setAudioRoutingSpeakerOn(bool enable) = 0; + virtual int setAudioRoutingSpeakerOn(bool enable, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the current audio routing. * @@ -404,7 +405,7 @@ class INGAudioDeviceManager : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int setPlayoutDevice(int index) = 0; + virtual int setPlayoutDevice(int index, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the recording device. * @@ -416,7 +417,7 @@ class INGAudioDeviceManager : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int setRecordingDevice(int index) = 0; + virtual int setRecordingDevice(int index, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** The status of following system default playback device. @note The status of following system default playback device. @@ -428,7 +429,7 @@ class INGAudioDeviceManager : public RefCountInterface { - 0: Success. - < 0: Failure. */ - virtual int followSystemPlaybackDevice(bool enable) = 0; + virtual int followSystemPlaybackDevice(bool enable, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** The status of following system default recording device. @@ -441,7 +442,7 @@ class INGAudioDeviceManager : public RefCountInterface { - 0: Success. - < 0: Failure. */ - virtual int followSystemRecordingDevice(bool enable) = 0; + virtual int followSystemRecordingDevice(bool enable, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; #endif // _WIN32 || (TARGET_OS_MAC && !TARGET_OS_IPHONE) #if defined(_WIN32) @@ -456,7 +457,7 @@ class INGAudioDeviceManager : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int setApplicationVolume(unsigned int volume) = 0; + virtual int setApplicationVolume(unsigned int volume, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the volume of the app. * @@ -482,7 +483,7 @@ class INGAudioDeviceManager : public RefCountInterface { * - 0: Success. * - < 0: Failure. 
*/ - virtual int setApplicationMuteState(bool mute) = 0; + virtual int setApplicationMuteState(bool mute, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the mute state of the app. * @@ -517,7 +518,7 @@ class INGAudioDeviceManager : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int setLoopbackDevice(int index) = 0; + virtual int setLoopbackDevice(int index, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** The status of following system default loopback device. @note The status of following system default loopback device. @@ -529,7 +530,7 @@ class INGAudioDeviceManager : public RefCountInterface { - 0: Success. - < 0: Failure. */ - virtual int followSystemLoopbackDevice(bool enable) = 0; + virtual int followSystemLoopbackDevice(bool enable, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; #endif // _WIN32 /** @@ -543,7 +544,7 @@ class INGAudioDeviceManager : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int registerObserver(IAudioDeviceManagerObserver* observer, void(*safeDeleter)(IAudioDeviceManagerObserver*) = NULL) = 0; + virtual int registerObserver(IAudioDeviceManagerObserver* observer, void(*safeDeleter)(IAudioDeviceManagerObserver*) = NULL, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the IAudioDeviceManagerObserver object. * @param observer The pointer to the IAudioDeviceManagerObserver class registered using #registerObserver. 
@@ -553,7 +554,7 @@ class INGAudioDeviceManager : public RefCountInterface { */ virtual int unregisterObserver(IAudioDeviceManagerObserver* observer) = 0; - virtual int setupAudioAttributeContext(void* audioAttr) = 0; + virtual int setupAudioAttributeContext(void* audioAttr, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; protected: ~INGAudioDeviceManager() {} diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraAudioTrack.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraAudioTrack.h index e6adb3e75..1d24933ed 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraAudioTrack.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraAudioTrack.h @@ -9,6 +9,7 @@ #pragma once // NOLINT(build/header_guard) #include "AgoraBase.h" +#include // FIXME(Ender): use this class instead of AudioSendStream as local track namespace agora { @@ -36,6 +37,7 @@ struct AudioSinkWants { channels(0) {} AudioSinkWants(int sampleRate, size_t chs) : samplesPerSec(sampleRate), channels(chs) {} + AudioSinkWants(int sampleRate, size_t chs, int trackNum) : samplesPerSec(sampleRate), channels(chs) {} }; /** @@ -85,7 +87,7 @@ class IAudioTrack : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int adjustPlayoutVolume(int volume) = 0; + virtual int adjustPlayoutVolume(int volume, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the current playback volume. @@ -106,7 +108,7 @@ * - `true`: Success. * - `false`: Failure. */ - virtual bool addAudioFilter(agora_refptr filter, AudioFilterPosition position) = 0; + virtual bool addAudioFilter(agora_refptr filter, AudioFilterPosition position, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Removes the audio filter added by calling `addAudioFilter`. * @@ -116,7 +118,7 @@ * - `true`: Success. * - `false`: Failure.
*/ - virtual bool removeAudioFilter(agora_refptr filter, AudioFilterPosition position) = 0; + virtual bool removeAudioFilter(agora_refptr filter, AudioFilterPosition position, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Enable / Disable specified audio filter @@ -127,7 +129,7 @@ class IAudioTrack : public RefCountInterface { * - 0: success * - <0: failure */ - virtual int enableAudioFilter(const char* id, bool enable, AudioFilterPosition position) { + virtual int enableAudioFilter(const char* id, bool enable, AudioFilterPosition position, ahpl_ref_t ares = AHPL_REF_INVALID) { (void)id; (void)enable; (void)position; @@ -144,7 +146,7 @@ class IAudioTrack : public RefCountInterface { * - 0: success * - <0: failure */ - virtual int setFilterProperty(const char* id, const char* key, const char* jsonValue, AudioFilterPosition position) { + virtual int setFilterProperty(const char* id, const char* key, const char* jsonValue, AudioFilterPosition position, ahpl_ref_t ares = AHPL_REF_INVALID) { (void)id; (void)key; (void)jsonValue; @@ -192,7 +194,7 @@ class IAudioTrack : public RefCountInterface { * - `true`: Success. * - `false`: Failure. */ - virtual bool addAudioSink(agora_refptr sink, const AudioSinkWants& wants) = 0; + virtual bool addAudioSink(agora_refptr sink, const AudioSinkWants& wants, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Removes an audio sink. @@ -202,7 +204,7 @@ class IAudioTrack : public RefCountInterface { * - `true`: Success. * - `false`: Failure. */ - virtual bool removeAudioSink(agora_refptr sink) = 0; + virtual bool removeAudioSink(agora_refptr sink, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; }; /** @@ -216,10 +218,10 @@ class ILocalAudioTrackObserver { * Occurs when the state of a local audio track changes. * * @param state The state of the local audio track. - * @param errorCode The error information for a state failure: \ref agora::rtc::LOCAL_AUDIO_STREAM_ERROR "LOCAL_AUDIO_STREAM_ERROR". 
+ * @param reasonCode The error information for a state failure: \ref agora::rtc::LOCAL_AUDIO_STREAM_REASON "LOCAL_AUDIO_STREAM_REASON". */ virtual void onLocalAudioTrackStateChanged(LOCAL_AUDIO_STREAM_STATE state, - LOCAL_AUDIO_STREAM_ERROR errorCode) = 0; + LOCAL_AUDIO_STREAM_REASON reasonCode) = 0; }; /** @@ -313,7 +315,7 @@ class ILocalAudioTrack : public IAudioTrack { * - `true`: Enable the local audio track. * - `false`: Disable the local audio track. */ - virtual void setEnabled(bool enable) = 0; + virtual int setEnabled(bool enable, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets whether the local audio track is enabled. @@ -343,7 +345,7 @@ * - 0: Success. * - < 0: Failure. */ - virtual int adjustPublishVolume(int volume) = 0; + virtual int adjustPublishVolume(int volume, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the current volume for publishing. @@ -366,7 +368,7 @@ * - 0: Success. * - < 0: Failure. */ - virtual int enableLocalPlayback(bool enable, bool sync = true) = 0; + virtual int enableLocalPlayback(bool enable, bool sync = true, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Enables in-ear monitoring (for Android and iOS only). @@ -379,7 +381,7 @@ * - 0: Success. * - < 0: Failure. */ - virtual int enableEarMonitor(bool enable, int includeAudioFilters) = 0; + virtual int enableEarMonitor(bool enable, int includeAudioFilters, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** Registers a local audio track observer * * @param observer A pointer to the local audio track observer: \ref agora::rtc::ILocalAudioTrackObserver * @return * - 0: Success. * - < 0: Failure.
*/ - virtual int registerTrackObserver(ILocalAudioTrackObserver* observer) = 0; + virtual int registerTrackObserver(ILocalAudioTrackObserver* observer, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** Releases the local audio track observer * * @param observer A pointer to the local audio track observer: \ref agora::rtc::ILocalAudioTrackObserver @@ -554,6 +556,10 @@ struct RemoteAudioTrackStats { * Duration of inbandfec */ int32_t fec_decode_ms; + /** + * The number of frozen 10 ms intervals within 2 seconds + */ + uint16_t frozen_count_10_ms; /** * The total time (ms) when the remote user neither stops sending the audio * stream nor disables the audio module after joining the channel. @@ -616,6 +622,7 @@ struct RemoteAudioTrackStats { frozen_rate_by_custom_plc_count(0), plc_count(0), fec_decode_ms(-1), + frozen_count_10_ms(0), total_active_time(0), publish_duration(0), e2e_delay_ms(0), @@ -656,7 +663,7 @@ class IRemoteAudioTrack : public IAudioTrack { * - 0: Success. * - < 0: Failure. */ - virtual int registerMediaPacketReceiver(IMediaPacketReceiver* packetReceiver) = 0; + virtual int registerMediaPacketReceiver(IMediaPacketReceiver* packetReceiver, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the `IMediaPacketReceiver` object. @@ -679,7 +686,7 @@ * - 0: Success. * - < 0: Failure. */ - virtual int registerAudioEncodedFrameReceiver(IAudioEncodedFrameReceiver* packetReceiver) = 0; + virtual int registerAudioEncodedFrameReceiver(IAudioEncodedFrameReceiver* packetReceiver, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the `IAudioEncodedFrameReceiver` object. @@ -702,7 +709,7 @@ - 0: Success. - < 0: Failure.
*/ - virtual int setRemoteVoicePosition(float pan, float gain) = 0; + virtual int setRemoteVoicePosition(float pan, float gain, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** set percentage of audio acceleration during poor network @@ -717,7 +724,7 @@ class IRemoteAudioTrack : public IAudioTrack { - 0: Success. - < 0: Failure. */ - virtual int adjustAudioAcceleration(int percentage) = 0; + virtual int adjustAudioAcceleration(int percentage, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** set percentage of audio deceleration during poor network @@ -732,7 +739,7 @@ class IRemoteAudioTrack : public IAudioTrack { - 0: Success. - < 0: Failure. */ - virtual int adjustAudioDeceleration(int percentage) = 0; + virtual int adjustAudioDeceleration(int percentage, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** enable spatial audio @@ -743,7 +750,7 @@ class IRemoteAudioTrack : public IAudioTrack { - 0: Success. - < 0: Failure. */ - virtual int enableSpatialAudio(bool enabled) = 0; + virtual int enableSpatialAudio(bool enabled, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** Sets remote user parameters for spatial audio @@ -753,7 +760,7 @@ class IRemoteAudioTrack : public IAudioTrack { - 0: Success. - < 0: Failure. 
*/ - virtual int setRemoteUserSpatialAudioParams(const agora::SpatialAudioParams& params) = 0; + virtual int setRemoteUserSpatialAudioParams(const agora::SpatialAudioParams& params, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; }; } // namespace rtc diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraCameraCapturer.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraCameraCapturer.h index 4101e9e07..09da1d422 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraCameraCapturer.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraCameraCapturer.h @@ -8,6 +8,7 @@ #include "AgoraBase.h" #include "AgoraRefPtr.h" +#include namespace agora { namespace rtc { @@ -120,7 +121,7 @@ class ICameraCapturer : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int setCameraSource(CAMERA_SOURCE source) = 0; + virtual int setCameraSource(CAMERA_SOURCE source, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the camera source. * @@ -136,7 +137,7 @@ class ICameraCapturer : public RefCountInterface { * @note * This method applies to Android and iOS only. */ - virtual void switchCamera() = 0; + virtual int switchCamera(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Returns whether zooming is supported by the current device. * @note @@ -160,7 +161,7 @@ class ICameraCapturer : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int32_t setCameraZoom(float zoomValue) = 0; + virtual int32_t setCameraZoom(float zoomValue, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the max zooming factor of the device. * @@ -191,7 +192,7 @@ class ICameraCapturer : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int32_t setCameraFocus(float x, float y) = 0; + virtual int32_t setCameraFocus(float x, float y, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Returns whether auto face focus is supported by the current device. 
* @note @@ -213,7 +214,7 @@ class ICameraCapturer : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int32_t setCameraAutoFaceFocus(bool enable) = 0; + virtual int32_t setCameraAutoFaceFocus(bool enable, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Enables or disables auto face detection. * @note @@ -224,7 +225,7 @@ class ICameraCapturer : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int32_t enableFaceDetection(bool enable) = 0; + virtual int32_t enableFaceDetection(bool enable, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Checks whether the camera face detect is supported. @@ -273,7 +274,7 @@ class ICameraCapturer : public RefCountInterface { * - 0: Success * - < 0: Failure */ - virtual int setCameraTorchOn(bool isOn) = 0; + virtual int setCameraTorchOn(bool on, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** Checks whether the camera exposure function is supported. * @@ -303,7 +304,7 @@ class ICameraCapturer : public RefCountInterface { *
* - < 0: Failure. * */ - virtual int setCameraExposurePosition(float positionXinView, float positionYinView) = 0; + virtual int setCameraExposurePosition(float positionXinView, float positionYinView, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Returns whether exposure value adjusting is supported by the current device. @@ -329,7 +330,7 @@ * - 0: Success. * - < 0: Failure. */ - virtual int setCameraExposureFactor(float value) = 0; + virtual int setCameraExposureFactor(float value, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; #if (defined(__APPLE__) && TARGET_OS_IOS) /** * @@ -343,7 +344,7 @@ * - 0: Success. * - < 0: Failure. */ - virtual bool enableMultiCamera(bool enable) = 0; + virtual bool enableMultiCamera(bool enable, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Checks whether the camera auto exposure function is supported. * @@ -367,7 +368,7 @@ *
* - < 0: Failure. * */ - virtual int setCameraAutoExposureFaceModeEnabled(bool enabled) = 0; + virtual int setCameraAutoExposureFaceModeEnabled(bool enabled, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; #endif #elif defined(_WIN32) || (defined(__linux__) && !defined(__ANDROID__)) || \ @@ -413,7 +414,7 @@ class ICameraCapturer : public RefCountInterface { * Set the device orientation of the capture device * @param VIDEO_ORIENTATION orientation of the device 0(by default), 90, 180, 270 */ - virtual void setDeviceOrientation(VIDEO_ORIENTATION orientation) = 0; + virtual int setDeviceOrientation(VIDEO_ORIENTATION orientation, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the format of the video captured by the camera. * @@ -422,7 +423,7 @@ * * @param capture_format The reference to the video format: VideoFormat. */ - virtual void setCaptureFormat(const VideoFormat& capture_format) = 0; + virtual int setCaptureFormat(const VideoFormat& capture_format, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the format of the video captured by the camera. * @return * @@ -434,7 +435,7 @@ * * @param observer Instance of the capture observer. */ - virtual int registerCameraObserver(ICameraCaptureObserver* observer) = 0; + virtual int registerCameraObserver(ICameraCaptureObserver* observer, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Unregisters the camera observer.
* diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraDataChannel.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraDataChannel.h index 02b4983f8..a79b2c79e 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraDataChannel.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraDataChannel.h @@ -9,6 +9,7 @@ #include "AgoraRefPtr.h" #include "AgoraBase.h" +#include namespace agora { /** @@ -36,10 +37,16 @@ struct DataChannelConfig { int compressionLength; // optional Optional channelId; // 0~7 + + /** + * The priority + */ + int32_t priority; DataChannelConfig() : syncWithMedia(false), ordered(false), - compressionLength(0) {} + compressionLength(0), + priority(-1) {} }; /** @@ -85,7 +92,7 @@ class ILocalDataChannel : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int sendDataPacket(const char* packet, size_t length) = 0; + virtual int sendDataPacket(const char* packet, size_t length, uint64_t capture_time_ms, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Send meta data to this data channel before publishing. * @@ -95,7 +102,7 @@ class ILocalDataChannel : public RefCountInterface { * - 0: Success. * - < 0: Failure. 
*/ - virtual int setMetaData(const char* metaData, size_t length) = 0; + virtual int setMetaData(const char* metaData, size_t length, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * return configured channel id diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraExtensionControl.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraExtensionControl.h index 5b27ffe99..da0c708df 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraExtensionControl.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraExtensionControl.h @@ -9,7 +9,6 @@ #pragma once // NOLINT(build/header_guard) #include "AgoraBase.h" #include "AgoraRefPtr.h" -#include "AgoraRefCountedObject.h" #include "IAgoraLog.h" #include "NGIAgoraVideoFrame.h" #include "NGIAgoraExtensionProvider.h" @@ -50,7 +49,7 @@ class IExtensionControl { * you can still call this method to perform an immediate memory recycle. * @param type Frame type to be recycled. */ - virtual void recycleVideoCache() = 0; + virtual int recycleVideoCache() = 0; /** * This method dumps the content of the video frame to the specified file. diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraLocalUser.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraLocalUser.h index 3f6e1d2cc..8a9d885bb 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraLocalUser.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraLocalUser.h @@ -11,6 +11,7 @@ #include #include "AgoraBase.h" #include "AgoraOptional.h" +#include namespace agora { namespace media { @@ -279,7 +280,7 @@ class ILocalUser { * as `role`, the connection fails with the \ref IRtcConnectionObserver::onConnectionFailure "onConnectionFailure" callback. * @param role The role of the user. See \ref rtc::CLIENT_ROLE_TYPE "CLIENT_ROLE_TYPE". 
*/ - virtual void setUserRole(rtc::CLIENT_ROLE_TYPE role) = 0; + virtual int setUserRole(rtc::CLIENT_ROLE_TYPE role, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the role of the user. @@ -288,8 +289,14 @@ class ILocalUser { */ virtual CLIENT_ROLE_TYPE getUserRole() = 0; - - virtual void setAudienceLatencyLevel(AUDIENCE_LATENCY_LEVEL_TYPE level) = 0; + /** + * Sets the latency level of an audience member. + * + * @param level The latency level of an audience member in interactive live streaming. See AUDIENCE_LATENCY_LEVEL_TYPE. + * @param role The user role determined by the config. A value of -1 means that no role has been configured. + */ + virtual int setAudienceLatencyLevel(AUDIENCE_LATENCY_LEVEL_TYPE level, int role, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; virtual AUDIENCE_LATENCY_LEVEL_TYPE getAudienceLatencyLevel() = 0; @@ -303,7 +310,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int setAudioEncoderConfiguration(const rtc::AudioEncoderConfiguration& config) = 0; + virtual int setAudioEncoderConfiguration(const rtc::AudioEncoderConfiguration& config, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the audio parameters and application scenarios. @@ -314,7 +321,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int setAudioScenario(AUDIO_SCENARIO_TYPE scenario) = 0; + virtual int setAudioScenario(AUDIO_SCENARIO_TYPE scenario, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * You can call this method to set the expected video scenario. @@ -326,7 +333,22 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int setVideoScenario(VIDEO_APPLICATION_SCENARIO_TYPE scenarioType) = 0; + virtual int setVideoScenario(VIDEO_APPLICATION_SCENARIO_TYPE scenarioType, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; + + /** + * Sets the video QoE preference. + * + * You can call this method to set the expected QoE preference. + * The SDK will optimize the video experience for each preference you set. 
+ * + * + * @param qoePreference The QoE preference type. See #VIDEO_QOE_PREFERENCE_TYPE. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setVideoQoEPreference(VIDEO_QOE_PREFERENCE_TYPE qoePreference, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the detailed statistics of the local audio. @@ -349,7 +371,7 @@ class ILocalUser { * - < 0: Failure. * - -5(ERR_REFUSED), if the role of the local user is not broadcaster. */ - virtual int publishAudio(agora_refptr audioTrack) = 0; + virtual int publishAudio(agora_refptr audioTrack, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Stops publishing the local audio track to the channel. @@ -359,7 +381,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int unpublishAudio(agora_refptr audioTrack) = 0; + virtual int unpublishAudio(agora_refptr audioTrack, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Publishes a local video track to the channel. @@ -369,7 +391,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int publishVideo(agora_refptr videoTrack) = 0; + virtual int publishVideo(agora_refptr videoTrack, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Stops publishing the local video track to the channel. @@ -379,7 +401,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int unpublishVideo(agora_refptr videoTrack) = 0; + virtual int unpublishVideo(agora_refptr videoTrack, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Subscribes to the audio of a specified remote user in channel. @@ -390,7 +412,7 @@ class ILocalUser { * - < 0: Failure. * - -2(ERR_INVALID_ARGUMENT), if no such user exists or `userId` is invalid. */ - virtual int subscribeAudio(user_id_t userId) = 0; + virtual int subscribeAudio(user_id_t userId, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Subscribes to the audio of all remote users in the channel. @@ -401,7 +423,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. 
*/ - virtual int subscribeAllAudio() = 0; + virtual int subscribeAllAudio(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Stops subscribing to the audio of a specified remote user in the channel. @@ -412,7 +434,7 @@ class ILocalUser { * - < 0: Failure. * - -2(ERR_INVALID_ARGUMENT), if no such user exists or `userId` is invalid. */ - virtual int unsubscribeAudio(user_id_t userId) = 0; + virtual int unsubscribeAudio(user_id_t userId, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Stops subscribing to the audio of all remote users in the channel. @@ -424,7 +446,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int unsubscribeAllAudio() = 0; + virtual int unsubscribeAllAudio(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Adjusts the playback signal volume. @@ -436,7 +458,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int adjustPlaybackSignalVolume(int volume) = 0; + virtual int adjustPlaybackSignalVolume(int volume, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the current playback signal volume. @@ -467,7 +489,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int adjustUserPlaybackSignalVolume(user_id_t userId, int volume) = 0; + virtual int adjustUserPlaybackSignalVolume(user_id_t userId, int volume, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the current playback signal volume of specified user. @@ -491,7 +513,7 @@ class ILocalUser { - 0: Success. - < 0: Failure. */ - virtual int enableSoundPositionIndication(bool enabled) = 0; + virtual int enableSoundPositionIndication(bool enabled, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** Sets the sound position and gain of a remote user. @@ -513,7 +535,7 @@ class ILocalUser { - 0: Success. - < 0: Failure. 
*/ - virtual int setRemoteVoicePosition(user_id_t userId, double pan, double gain) = 0; + virtual int setRemoteVoicePosition(user_id_t userId, double pan, double gain, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** enable spatial audio @@ -524,7 +546,7 @@ class ILocalUser { - 0: Success. - < 0: Failure. */ - virtual int enableSpatialAudio(bool enabled) = 0; + virtual int enableSpatialAudio(bool enabled, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** Sets remote user parameters for spatial audio @@ -535,7 +557,7 @@ class ILocalUser { - 0: Success. - < 0: Failure. */ - virtual int setRemoteUserSpatialAudioParams(user_id_t userId, const agora::SpatialAudioParams& param) = 0; + virtual int setRemoteUserSpatialAudioParams(user_id_t userId, const agora::SpatialAudioParams& param, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the audio frame parameters for the \ref agora::media::IAudioFrameObserver::onPlaybackAudioFrame @@ -556,7 +578,7 @@ class ILocalUser { virtual int setPlaybackAudioFrameParameters(size_t numberOfChannels, uint32_t sampleRateHz, RAW_AUDIO_FRAME_OP_MODE_TYPE mode = RAW_AUDIO_FRAME_OP_MODE_READ_ONLY, - int samplesPerCall = 0) = 0; + int samplesPerCall = 0, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the audio frame parameters for the \ref agora::media::IAudioFrameObserver::onRecordAudioFrame * "onRecordAudioFrame" callback. @@ -576,7 +598,7 @@ class ILocalUser { virtual int setRecordingAudioFrameParameters(size_t numberOfChannels, uint32_t sampleRateHz, RAW_AUDIO_FRAME_OP_MODE_TYPE mode = RAW_AUDIO_FRAME_OP_MODE_READ_ONLY, - int samplesPerCall = 0) = 0; + int samplesPerCall = 0, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the audio frame parameters for the \ref agora::media::IAudioFrameObserver::onMixedAudioFrame * "onMixedAudioFrame" callback. 
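The dominant change across these hunks is mechanical: nearly every mutator gains a trailing `ahpl_ref_t ares = AHPL_REF_INVALID` parameter. Because the new parameter is defaulted, pre-4.3.0 call sites keep compiling unchanged, while new callers can opt in by passing an async-result reference. A minimal sketch of that source-compatibility pattern, using stand-in definitions (`ahpl_ref_t`, `AHPL_REF_INVALID`, and the function below are illustrative assumptions, not the SDK's real declarations):

```cpp
#include <cstdint>

// Stand-ins for the AHPL async-result types referenced throughout the diff;
// their real definitions come from the (elided) ahpl header include.
typedef intptr_t ahpl_ref_t;
#define AHPL_REF_INVALID ((ahpl_ref_t)-1)

// Illustrative widened signature: the defaulted trailing parameter keeps
// existing synchronous call sites source-compatible.
inline int setAudioScenarioSketch(int scenario,
                                  ahpl_ref_t ares = AHPL_REF_INVALID) {
  (void)scenario;
  if (ares == AHPL_REF_INVALID) {
    return 0;  // synchronous path, behaving as before the change
  }
  return 1;    // caller supplied an async-result handle
}
```

An old call such as `setAudioScenarioSketch(3)` still compiles and takes the synchronous path; only callers that opt into the asynchronous result pass the extra argument.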
@@ -593,7 +615,7 @@ class ILocalUser { */ virtual int setMixedAudioFrameParameters(size_t numberOfChannels, uint32_t sampleRateHz, - int samplesPerCall = 0) = 0; + int samplesPerCall = 0, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the audio frame parameters for the \ref agora::media::IAudioFrameObserver::onEarMonitoringAudioFrame @@ -617,7 +639,7 @@ class ILocalUser { size_t numberOfChannels, uint32_t sampleRateHz, RAW_AUDIO_FRAME_OP_MODE_TYPE mode = RAW_AUDIO_FRAME_OP_MODE_READ_ONLY, - int samplesPerCall = 0) = 0; + int samplesPerCall = 0, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the audio frame parameters for the \ref agora::media::IAudioFrameObserver::onPlaybackAudioFrameBeforeMixing @@ -634,7 +656,7 @@ class ILocalUser { * - < 0: Failure. */ virtual int setPlaybackAudioFrameBeforeMixingParameters(size_t numberOfChannels, - uint32_t sampleRateHz) = 0; + uint32_t sampleRateHz, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Registers an audio frame observer. @@ -652,7 +674,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int registerAudioFrameObserver(agora::media::IAudioFrameObserverBase * observer) = 0; + virtual int registerAudioFrameObserver(agora::media::IAudioFrameObserverBase* observer, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the audio frame observer. * @@ -661,7 +683,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int unregisterAudioFrameObserver(agora::media::IAudioFrameObserverBase * observer) = 0; + virtual int unregisterAudioFrameObserver(agora::media::IAudioFrameObserverBase* observer) = 0; /** * Enable the audio spectrum monitor. @@ -673,7 +695,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int enableAudioSpectrumMonitor(int intervalInMS = 100) = 0; + virtual int enableAudioSpectrumMonitor(int intervalInMS = 100, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Disalbe the audio spectrum monitor. 
* @@ -681,7 +703,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int disableAudioSpectrumMonitor() = 0; + virtual int disableAudioSpectrumMonitor(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Registers an audio spectrum observer. @@ -697,7 +719,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int registerAudioSpectrumObserver(agora::media::IAudioSpectrumObserver * observer, void (*safeDeleter)(agora::media::IAudioSpectrumObserver*)) = 0; + virtual int registerAudioSpectrumObserver(agora::media::IAudioSpectrumObserver * observer, void (*safeDeleter)(agora::media::IAudioSpectrumObserver*), ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the audio spectrum observer. * @@ -721,7 +743,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int registerLocalVideoEncodedFrameObserver(agora::media::IVideoEncodedFrameObserver* observer) = 0; + virtual int registerLocalVideoEncodedFrameObserver(agora::media::IVideoEncodedFrameObserver* observer, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the \ref agora::media::IVideoEncodedFrameObserver "IVideoEncodedFrameObserver" object. * @param observer The pointer to the `IVideoEncodedFrameObserver` object. @@ -737,7 +759,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int forceNextIntraFrame() = 0; + virtual int forceNextIntraFrame(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Registers an \ref agora::media::IVideoEncodedFrameObserver "IVideoEncodedFrameObserver" object. * @@ -750,7 +772,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int registerVideoEncodedFrameObserver(agora::media::IVideoEncodedFrameObserver* observer) = 0; + virtual int registerVideoEncodedFrameObserver(agora::media::IVideoEncodedFrameObserver* observer, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the \ref agora::media::IVideoEncodedFrameObserver "IVideoEncodedFrameObserver" object. 
* @param observer The pointer to the `IVideoEncodedFrameObserver` object. @@ -772,7 +794,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int registerVideoFrameObserver(IVideoFrameObserver2* observer) = 0; + virtual int registerVideoFrameObserver(IVideoFrameObserver2* observer, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the \ref agora::rtc::IVideoFrameObserver2 "IVideoFrameObserver2" object. * @param observer The pointer to the `IVideoFrameObserver2` object. @@ -783,9 +805,9 @@ class ILocalUser { virtual int unregisterVideoFrameObserver(IVideoFrameObserver2* observer) = 0; virtual int setVideoSubscriptionOptions(user_id_t userId, - const VideoSubscriptionOptions& options) = 0; + const VideoSubscriptionOptions& options, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; - virtual int setHighPriorityUserList(uid_t* vipList, int uidNum, int option) = 0; + virtual int setHighPriorityUserList(uid_t* vipList, int uidNum, int option, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; virtual int getHighPriorityUserList(std::vector& vipList, int& option) = 0; @@ -803,7 +825,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int setSubscribeAudioBlocklist(user_id_t* userList, int userNumber) = 0; + virtual int setSubscribeAudioBlocklist(user_id_t* userList, int userNumber, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the allowlist of subscribe remote stream audio. @@ -821,7 +843,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int setSubscribeAudioAllowlist(user_id_t* userList, int userNumber) = 0; + virtual int setSubscribeAudioAllowlist(user_id_t* userList, int userNumber, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the blocklist of subscribe remote stream video. @@ -837,7 +859,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. 
*/ - virtual int setSubscribeVideoBlocklist(user_id_t* userList, int userNumber) = 0; + virtual int setSubscribeVideoBlocklist(user_id_t* userList, int userNumber, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the allowlist of subscribe remote stream video. @@ -855,7 +877,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int setSubscribeVideoAllowlist(user_id_t* userList, int userNumber) = 0; + virtual int setSubscribeVideoAllowlist(user_id_t* userList, int userNumber, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Subscribes to the video of a specified remote user in the channel. @@ -870,7 +892,7 @@ class ILocalUser { * - -2(ERR_INVALID_ARGUMENT), if `userId` is invalid. */ virtual int subscribeVideo(user_id_t userId, - const VideoSubscriptionOptions &subscriptionOptions) = 0; + const VideoSubscriptionOptions &subscriptionOptions, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Subscribes to the video of all remote users in the channel. @@ -882,7 +904,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int subscribeAllVideo(const VideoSubscriptionOptions &subscriptionOptions) = 0; + virtual int subscribeAllVideo(const VideoSubscriptionOptions &subscriptionOptions, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Stops subscribing to the video of a specified remote user in the channel. @@ -893,7 +915,7 @@ class ILocalUser { * - < 0: Failure. * - -2(ERR_INVALID_ARGUMENT), if `userId` is invalid. */ - virtual int unsubscribeVideo(user_id_t userId) = 0; + virtual int unsubscribeVideo(user_id_t userId, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Stops subscribing to the video of all remote users in the channel. @@ -905,7 +927,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. 
*/ - virtual int unsubscribeAllVideo() = 0; + virtual int unsubscribeAllVideo(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the time interval and the volume smoothing factor of the \ref agora::rtc::ILocalUserObserver::onAudioVolumeIndication "onAudioVolumeIndication" callback. @@ -926,7 +948,7 @@ class ILocalUser { * - < 0: Failure. * - -2(ERR_INVALID_ARGUMENT), if `intervalInMS` or `smooth` is out of range. */ - virtual int setAudioVolumeIndicationParameters(int intervalInMS, int smooth, bool reportVad) = 0; + virtual int setAudioVolumeIndicationParameters(int intervalInMS, int smooth, bool reportVad, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Registers a local user observer object. @@ -941,7 +963,7 @@ class ILocalUser { */ virtual int registerLocalUserObserver( ILocalUserObserver* observer, - void(*safeDeleter)(ILocalUserObserver*) = NULL) = 0; + void(*safeDeleter)(ILocalUserObserver*) = NULL, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the \ref agora::rtc::ILocalUserObserver "ILocalUserObserver" object. @@ -974,7 +996,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int registerMediaControlPacketReceiver(IMediaControlPacketReceiver* ctrlPacketReceiver) = 0; + virtual int registerMediaControlPacketReceiver(IMediaControlPacketReceiver* ctrlPacketReceiver, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the media control packet receiver. @@ -995,7 +1017,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int sendIntraRequest(user_id_t userId) = 0; + virtual int sendIntraRequest(user_id_t userId, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Set local audio filterable by topn @@ -1008,7 +1030,7 @@ class ILocalUser { * - < 0: Failure. 
*/ - virtual int setAudioFilterable(bool filterable) = 0; + virtual int setAudioFilterable(bool filterable, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Enable / Disable specified audio filter @@ -1019,7 +1041,7 @@ class ILocalUser { * - 0: success * - <0: failure */ - virtual int enableRemoteAudioTrackFilter(user_id_t userId, const char* id, bool enable) = 0; + virtual int enableRemoteAudioTrackFilter(user_id_t userId, const char* id, bool enable, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * set the properties of the specified audio filter @@ -1031,7 +1053,7 @@ class ILocalUser { * - 0: success * - <0: failure */ - virtual int setRemoteAudioTrackFilterProperty(user_id_t userId, const char* id, const char* key, const char* jsonValue) = 0; + virtual int setRemoteAudioTrackFilterProperty(user_id_t userId, const char* id, const char* key, const char* jsonValue, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * get the properties of the specified audio filter @@ -1053,7 +1075,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int publishDataChannel(agora_refptr channel) = 0; + virtual int publishDataChannel(agora_refptr channel, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Stops publishing the data channel to the channel. * @@ -1062,7 +1084,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int unpublishDataChannel(agora_refptr channel) = 0; + virtual int unpublishDataChannel(agora_refptr channel, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Subscribes to a specified data channel of a specified remote user in channel. * @@ -1072,7 +1094,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int subscribeDataChannel(user_id_t userId, int channelId) = 0; + virtual int subscribeDataChannel(user_id_t userId, int channelId, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Stops subscribing to the data channel of a specified remote user in the channel. * @@ -1083,8 +1105,7 @@ class ILocalUser { * - < 0: Failure. 
* - -2(ERR_INVALID_ARGUMENT), if no such user exists or `userId` is invalid. */ - - virtual int unsubscribeDataChannel(user_id_t userId, int channelId) = 0; + virtual int unsubscribeDataChannel(user_id_t userId, int channelId, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Registers an data channel observer. * @@ -1095,7 +1116,7 @@ class ILocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int registerDataChannelObserver(IDataChannelObserver * observer) = 0; + virtual int registerDataChannelObserver(IDataChannelObserver * observer, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the data channel observer. * @@ -1116,7 +1137,27 @@ class ILocalUser { * - 0: success * - <0: failure */ - virtual int SetAudioNsMode(bool NsEnable, NS_MODE NsMode, NS_LEVEL NsLevel, NS_DELAY NsDelay) = 0; + virtual int SetAudioNsMode(bool NsEnable, NS_MODE NsMode, NS_LEVEL NsLevel, NS_DELAY NsDelay, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; + /** + * Enables the local mixed audio track that mixes the specified track. + * + * @param track The specified mixed audio track. + * @param enable Whether to start mixing this user's audio. + * @param MixLocal Whether to mix the locally published stream. + * @param MixRemote Whether to mix remote streams. + * @return + * - 0: success + * - <0: failure + */ + virtual int EnableLocalMixedAudioTrack(agora_refptr& track, bool enable, bool MixLocal, bool MixRemote, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; + /** + * Triggers the data channel update callback with the information of all data channels. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int takeDataChannelSnapshot(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; }; /** @@ -1240,11 +1281,11 @@ class ILocalUserObserver { * * @param videoTrack The pointer to `ILocalVideoTrack`. * @param state The state of the local video track. - * @param errorCode The error information. 
*/ virtual void onLocalVideoTrackStateChanged(agora_refptr videoTrack, LOCAL_VIDEO_STREAM_STATE state, - LOCAL_VIDEO_STREAM_ERROR errorCode) = 0; + LOCAL_VIDEO_STREAM_REASON reason) = 0; /** * Reports the statistics of a local video track. @@ -1265,7 +1306,7 @@ class ILocalUserObserver { * @param trackInfo The information of the remote video track. * @param videoTrack The pointer to `IRemoteVideoTrack`. */ - virtual void onUserVideoTrackSubscribed(user_id_t userId, VideoTrackInfo trackInfo, + virtual void onUserVideoTrackSubscribed(user_id_t userId, const VideoTrackInfo& trackInfo, agora_refptr videoTrack) = 0; /** diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraMediaNode.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraMediaNode.h index 2ac47728e..f74fb00ed 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraMediaNode.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraMediaNode.h @@ -4,6 +4,7 @@ #include "IAgoraLog.h" #include "NGIAgoraVideoFrame.h" #include "AgoraExtensionVersion.h" +#include #ifndef OPTIONAL_PROCESSRESULT_SPECIFIER #if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) @@ -299,16 +300,14 @@ class IExtensionVideoFilter : public IVideoFilter { }; /** - * @brief SDK will invoke this API first to get the filter's requested process mode @ref ProcessMode and threading model + * @brief SDK will invoke this API first to get the filter's requested process mode @ref ProcessMode * @param mode [out] filter assign its desired the process mode @ref ProcessMode - * @param independent_thread [out] filter assign its desired threading model. When this boolean is set "true", an - * indepent thread will be assigned to the current filter and all invocations from SDK afterwards are ensured to - * happen on that fixed thread. If this boolean flag is set "false", the filter will re-use the thread of the SDK's - * data path. 
All invocations from SDK afterwards are also ensured to be on the same thread, however that thread is shared. + * @param independent_thread deprecated. SDK will ignore this parameter. * @note If the filter implementation is not thread sensitive, we recommend to set the boolean to "false" to reduce thread context * switching. */ virtual void getProcessMode(ProcessMode& mode, bool& independent_thread) = 0; + /** * @brief SDK will invoke this API before feeding video frame data to the filter. Filter can perform its initialization/preparation job * in this step. @@ -466,7 +465,7 @@ class IAudioPcmDataSender : public RefCountInterface { const size_t samples_per_channel, // for 10ms Data, number_of_samples * 100 = sample_rate const agora::rtc::BYTES_PER_SAMPLE bytes_per_sample, // 2 const size_t number_of_channels, - const uint32_t sample_rate) = 0; // sample_rate > 8000) + const uint32_t sample_rate, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; // sample_rate > 8000) protected: ~IAudioPcmDataSender() {} @@ -494,7 +493,7 @@ class IAudioEncodedFrameSender : public RefCountInterface { * - `false`: Failure. */ virtual bool sendEncodedAudioFrame(const uint8_t* payload_data, size_t payload_size, - const EncodedAudioFrameInfo& audioFrameInfo) = 0; + const EncodedAudioFrameInfo& audioFrameInfo, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; protected: ~IAudioEncodedFrameSender() {} @@ -577,7 +576,7 @@ class IMediaPacketSender : public RefCountInterface { * - `false`: Failure. 
*/ virtual int sendMediaPacket(const uint8_t *packet, size_t length, - const media::base::PacketOptions &options) = 0; + const media::base::PacketOptions &options, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; protected: ~IMediaPacketSender() {} }; @@ -605,7 +604,7 @@ class IMediaControlPacketSender { */ virtual int sendPeerMediaControlPacket(media::base::user_id_t userId, const uint8_t *packet, - size_t length) = 0; + size_t length, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sends the media transport control packet to all users. @@ -617,7 +616,7 @@ class IMediaControlPacketSender { * - `true`: Success. * - `false`: Failure. */ - virtual int sendBroadcastMediaControlPacket(const uint8_t *packet, size_t length) = 0; + virtual int sendBroadcastMediaControlPacket(const uint8_t *packet, size_t length, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; virtual ~IMediaControlPacketSender() {} }; @@ -659,7 +658,7 @@ class IVideoFrameSender : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int sendVideoFrame(const media::base::ExternalVideoFrame& videoFrame) = 0; + virtual int sendVideoFrame(const media::base::ExternalVideoFrame& videoFrame, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; protected: ~IVideoFrameSender() {} @@ -686,7 +685,7 @@ class IVideoEncodedImageSender : public RefCountInterface { * - `false`: Failure. */ virtual bool sendEncodedVideoImage(const uint8_t* imageBuffer, size_t length, - const EncodedVideoFrameInfo& videoEncodedFrameInfo) = 0; + const EncodedVideoFrameInfo& videoEncodedFrameInfo, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; protected: ~IVideoEncodedImageSender() {} @@ -780,7 +779,7 @@ class IVideoRenderer : public IVideoSinkBase { * - 0: Success. * - < 0: Failure. */ - virtual int setRenderMode(media::base::RENDER_MODE_TYPE renderMode) = 0; + virtual int setRenderMode(media::base::RENDER_MODE_TYPE renderMode, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the render mode of the view. * @param view the view to set render mode. 
@@ -789,7 +788,7 @@ class IVideoRenderer : public IVideoSinkBase { * - 0: Success. * - < 0: Failure. */ - virtual int setRenderMode(void* view, media::base::RENDER_MODE_TYPE renderMode) = 0; + virtual int setRenderMode(void* view, media::base::RENDER_MODE_TYPE renderMode, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets whether to mirror the video. * @param mirror Whether to mirror the video: @@ -799,7 +798,7 @@ class IVideoRenderer : public IVideoSinkBase { * - 0: Success. * - < 0: Failure. */ - virtual int setMirror(bool mirror) = 0; + virtual int setMirror(bool mirror, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets whether to mirror the video. * @param view the view to set mirror mode. @@ -810,7 +809,7 @@ class IVideoRenderer : public IVideoSinkBase { * - 0: Success. * - < 0: Failure. */ - virtual int setMirror(void* view, bool mirror) = 0; + virtual int setMirror(void* view, bool mirror, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the video display window. * @param view The pointer to the video display window. @@ -818,7 +817,7 @@ class IVideoRenderer : public IVideoSinkBase { * - 0: Success. * - < 0: Failure. */ - virtual int setView(void* view) = 0; + virtual int setView(void* view, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the video display window. * @param view The pointer to the video display window. @@ -827,14 +826,14 @@ class IVideoRenderer : public IVideoSinkBase { * - 0: Success. * - < 0: Failure. */ - virtual int addView(void* view, const Rectangle& cropArea) = 0; + virtual int addView(void* view, const Rectangle& cropArea, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Stops rendering the video view on the window. * @return * - 0: Success. * - < 0: Failure. */ - virtual int unsetView() = 0; + virtual int unsetView(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * remove rendering the video view on the window. 
* @return @@ -853,8 +852,8 @@ class IVideoTrack; class IVideoFrameTransceiver : public RefCountInterface { public: virtual int getTranscodingDelayMs() = 0; - virtual int addVideoTrack(agora_refptr track) = 0; - virtual int removeVideoTrack(agora_refptr track) = 0; + virtual int addVideoTrack(agora_refptr track, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; + virtual int removeVideoTrack(agora_refptr track, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; }; } diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraMediaNodeFactory.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraMediaNodeFactory.h index 5a53ce562..fb57b852e 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraMediaNodeFactory.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraMediaNodeFactory.h @@ -7,6 +7,7 @@ #pragma once // NOLINT(build/header_guard) #include "AgoraBase.h" +#include namespace agora { namespace rtc { @@ -57,17 +58,6 @@ class IMediaNodeFactory : public RefCountInterface { */ virtual agora_refptr createAudioEncodedFrameSender() = 0; - /** - * Creates a remote audio mixer source object and returns the pointer. - * - * @param type The type of audio mixer source you want to create. - * - * @return - * - The pointer to \ref rtc::IRemoteAudioMixerSource "IRemoteAudioMixerSource", if the method call succeeds. - * - A null pointer, if the method call fails. - */ - virtual agora_refptr createRemoteAudioMixerSource() = 0; - /** * Creates a camera capturer. * @@ -80,6 +70,7 @@ class IMediaNodeFactory : public RefCountInterface { */ virtual agora_refptr createCameraCapturer() = 0; +#if !defined(__ANDROID__) && !(defined(__APPLE__) && TARGET_OS_IPHONE) /** * Creates a screen capturer. * @@ -91,6 +82,7 @@ class IMediaNodeFactory : public RefCountInterface { * - A null pointer: Failure. */ virtual agora_refptr createScreenCapturer() = 0; +#endif /** * Creates a video mixer. 
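Note that `createScreenCapturer` is now wrapped in a preprocessor guard, so on Android and iOS the factory method does not exist at all; calling code must use the same compile-time guard rather than a runtime check. A hedged sketch of how such a guard resolves (the constant name is illustrative; `TARGET_OS_IPHONE` evaluates as 0 in `#if` when undefined):

```cpp
// Mirrors the guard added around createScreenCapturer in this diff:
// on mobile builds the declaration is compiled out entirely.
#if !defined(__ANDROID__) && !(defined(__APPLE__) && TARGET_OS_IPHONE)
const bool kScreenCapturerAvailable = true;   // desktop builds
#else
const bool kScreenCapturerAvailable = false;  // Android / iOS builds
#endif
```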
diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRemoteAudioMixerSource.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRemoteAudioMixerSource.h index e4710711f..12aa0d80d 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRemoteAudioMixerSource.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRemoteAudioMixerSource.h @@ -7,6 +7,7 @@ #pragma once #include "AgoraRefPtr.h" +#include namespace agora { namespace rtc { @@ -26,13 +27,13 @@ class IRemoteAudioMixerSource : public RefCountInterface { * Add a audio track for mixing. Automatically starts mixing if add audio track * @param track The instance of the audio track that you want mixer to receive its audio stream. */ - virtual int addAudioTrack(agora_refptr track) = 0; + virtual int addAudioTrack(agora_refptr track, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Remove a audio track for mixing. Automatically stops the mixed stream if all audio tracks are removed * @param track The instance of the audio track that you want to remove from the mixer. */ - virtual int removeAudioTrack(agora_refptr track) = 0; + virtual int removeAudioTrack(agora_refptr track, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the delay time for mix. diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRtcConnection.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRtcConnection.h index 886b2a257..4085bd1d6 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRtcConnection.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRtcConnection.h @@ -9,6 +9,7 @@ #include "AgoraBase.h" #include "time_utils.h" +#include namespace agora { namespace rtc { @@ -148,6 +149,11 @@ struct RtcConnectionConfiguration { */ bool isInteractiveAudience; + /** + * Determines whether the connection is used for data channels only. 
+ */ + bool isDataChannelOnly; + RtcConnectionConfiguration() : autoSubscribeAudio(true), autoSubscribeVideo(true), @@ -160,7 +166,8 @@ struct RtcConnectionConfiguration { audioRecvEncodedFrame(false), audioRecvMediaPacket(false), videoRecvMediaPacket(false), - isInteractiveAudience(false) {} + isInteractiveAudience(false), + isDataChannelOnly(false) {} }; /** @@ -205,7 +212,7 @@ class IRtcConnection : public RefCountInterface { * - -2(ERR_INVALID_ARGUMENT): The argument that you pass is invalid. * - -8(ERR_INVALID_STATE): The current connection state is not CONNECTION_STATE_DISCONNECTED(1). */ - virtual int connect(const char* token, const char* channelId, user_id_t userId) = 0; + virtual int connect(const char* token, const char* channelId, user_id_t userId, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Connects to an Agora channel. @@ -218,7 +225,7 @@ class IRtcConnection : public RefCountInterface { * The SDK also triggers `onConnected` or `onDisconnected` to notify you of the state change. * @param settings The settings of connecting. */ - virtual int connect(const TConnectSettings& settings) = 0; + virtual int connect(const TConnectSettings& settings, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Disconnects from the Agora channel. @@ -231,7 +238,7 @@ class IRtcConnection : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int disconnect() = 0; + virtual int disconnect(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Starts the last-mile network probe test. @@ -260,7 +267,7 @@ class IRtcConnection : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int startLastmileProbeTest(const LastmileProbeConfig& config) = 0; + virtual int startLastmileProbeTest(const LastmileProbeConfig& config, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Stops the last-mile network probe test. @@ -268,7 +275,7 @@ class IRtcConnection : public RefCountInterface { * - 0: Success. * - < 0: Failure. 
*/ - virtual int stopLastmileProbeTest() = 0; + virtual int stopLastmileProbeTest(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Renews the token. @@ -279,7 +286,7 @@ class IRtcConnection : public RefCountInterface { * * @param token The pointer to the new token. */ - virtual int renewToken(const char* token) = 0; + virtual int renewToken(const char* token, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the connection information. @@ -333,7 +340,7 @@ class IRtcConnection : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int registerObserver(IRtcConnectionObserver* observer, void(*safeDeleter)(IRtcConnectionObserver*) = NULL) = 0; + virtual int registerObserver(IRtcConnectionObserver* observer, void(*safeDeleter)(IRtcConnectionObserver*) = NULL, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the registered IRtcConnectionObserver object. @@ -354,7 +361,7 @@ class IRtcConnection : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int registerNetworkObserver(INetworkObserver* observer, void(*safeDeleter)(INetworkObserver*) = NULL) = 0; + virtual int registerNetworkObserver(INetworkObserver* observer, void(*safeDeleter)(INetworkObserver*) = NULL, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the registered INetworkObserver object. @@ -430,7 +437,7 @@ class IRtcConnection : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int sendStreamMessage(int streamId, const char* data, size_t length) = 0; + virtual int sendStreamMessage(int streamId, const char* data, size_t length, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** Enables/Disables the built-in encryption. * @@ -450,7 +457,7 @@ class IRtcConnection : public RefCountInterface { * - 0: Success. * - < 0: Failure. 
*/ - virtual int enableEncryption(bool enabled, const EncryptionConfig& config) = 0; + virtual int enableEncryption(bool enabled, const EncryptionConfig& config, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Reports a custom event to Agora. @@ -465,7 +472,7 @@ class IRtcConnection : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int sendCustomReportMessage(const char* id, const char* category, const char* event, const char* label, int value) = 0; + virtual int sendCustomReportMessage(const char* id, const char* category, const char* event, const char* label, int value, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** Gets the user information by user account, which is in string format. * * @param userAccount The user account of the user. @@ -685,8 +692,8 @@ class IRtcConnectionObserver { * @param height image height * @param errCode 0 is ok negative is error */ - virtual void onSnapshotTaken(uid_t uid, const char* filePath, int width, int height, int errCode) { - (void)uid; + virtual void onSnapshotTaken(user_id_t userId, const char* filePath, int width, int height, int errCode) { - (void)uid; + (void)userId; (void)filePath; (void)width; (void)height; @@ -786,6 +793,15 @@ class IRtcConnectionObserver { (void)reason; } + /** + * Occurs when the response of setting the RTM flag is received. + * + * @param code The error code. + */ + virtual void onSetRtmFlagResult(int code) { + (void)code; + } + /** Occurs when the WIFI message need be sent to the user. * * @param reason The reason of notifying the user of a message. @@ -803,7 +819,7 @@ class IRtcConnectionObserver { * @param currentStats Instantaneous value of optimization effect. * @param averageStats Average value of cumulative optimization effect. 
*/ - virtual void onWlAccStats(WlAccStats currentStats, WlAccStats averageStats) { + virtual void onWlAccStats(const WlAccStats& currentStats, const WlAccStats& averageStats) { (void)currentStats; (void)averageStats; } diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRtmpConnection.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRtmpConnection.h index 98bdfad9c..d0516cb6d 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRtmpConnection.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRtmpConnection.h @@ -9,6 +9,7 @@ #include "AgoraBase.h" #include "AgoraRefPtr.h" +#include namespace agora { namespace rtc { @@ -338,7 +339,7 @@ class IRtmpConnection : public RefCountInterface { * - ERR_INVALID_ARGUMENT: The passed in argument is invalid. * - ERR_INVALID_STATE: The current connection state is not STATE_DISCONNECTED(3). */ - virtual int connect(const char* url) = 0; + virtual int connect(const char* url, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Disconnects from the RTMP server. @@ -347,7 +348,7 @@ class IRtmpConnection : public RefCountInterface { * STATE_DISCONNECTED(4). You will be notified with the callback * \ref onDisconnected "onDisconnected". */ - virtual int disconnect() = 0; + virtual int disconnect(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the current connection information. @@ -375,7 +376,7 @@ class IRtmpConnection : public RefCountInterface { * - 0: Success. * - < 0: Failure. */ - virtual int registerObserver(IRtmpConnectionObserver* observer, void(*safeDeleter)(IRtmpConnectionObserver*) = NULL) = 0; + virtual int registerObserver(IRtmpConnectionObserver* observer, void(*safeDeleter)(IRtmpConnectionObserver*) = NULL, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the registered IRtmpConnectionObserver object. 
diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRtmpLocalUser.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRtmpLocalUser.h index 22172b769..4b2531f47 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRtmpLocalUser.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraRtmpLocalUser.h @@ -134,7 +134,7 @@ class IRtmpLocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int setAudioStreamConfiguration(const RtmpStreamingAudioConfiguration& config) = 0; + virtual int setAudioStreamConfiguration(const RtmpStreamingAudioConfiguration& config, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Set the parameters of the video encoder when pushing the stream @@ -145,7 +145,7 @@ class IRtmpLocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int setVideoStreamConfiguration(const RtmpStreamingVideoConfiguration& config) = 0; + virtual int setVideoStreamConfiguration(const RtmpStreamingVideoConfiguration& config, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Adjusts the audio volume for publishing. @@ -156,7 +156,7 @@ class IRtmpLocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int adjustRecordingSignalVolume(int volume) = 0; + virtual int adjustRecordingSignalVolume(int volume, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the current volume for publishing. @@ -168,18 +168,6 @@ class IRtmpLocalUser { */ virtual int getRecordingSignalVolume(int32_t* volume) = 0; - /** - * Set whether to enable local audio - * @param enabled Whether to enable local audio: - * - `true`: Enable local audio. - * - `false`: Disable local audio. - * - * @return - * - 0: Success. - * - < 0: Failure. - */ - virtual int setAudioEnabled(bool enabled) = 0; - /** * Dynamically adjust the bit rate parameters of the video encoder in the push stream * @@ -194,7 +182,7 @@ class IRtmpLocalUser { * - 0: Success. * - < 0: Failure. 
*/ - virtual void adjustVideoBitrate(VideoBitrateAdjustType type) = 0; + virtual int adjustVideoBitrate(VideoBitrateAdjustType type, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Set whether to enable local video @@ -207,7 +195,7 @@ class IRtmpLocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int setVideoEnabled(bool enabled) = 0; + virtual int setVideoEnabled(bool enabled, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Publishes a local audio track to the RTMP connection. @@ -217,7 +205,7 @@ class IRtmpLocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int publishAudio(agora_refptr audioTrack) = 0; + virtual int publishAudio(agora_refptr audioTrack, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Stops publishing the local audio track to the RTMP connection. @@ -227,29 +215,7 @@ class IRtmpLocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int unpublishAudio(agora_refptr audioTrack) = 0; - - /** - * Publishes a media player local audio track to the RTMP connection. - * - * @param audioTrack The local audio track to be published: ILocalAudioTrack. - * @param playerId The player source ID. - * @return - * - 0: Success. - * - < 0: Failure. - */ - virtual int publishMediaPlayerAudio(agora_refptr audioTrack, int32_t playerId=0) = 0; - - /** - * Stops publishing the media player local audio track to the RTMP connection. - * - * @param audioTrack The local audio track that you want to stop publishing: ILocalAudioTrack. - * @param playerId The player source ID. - * @return - * - 0: Success. - * - < 0: Failure. - */ - virtual int unpublishMediaPlayerAudio(agora_refptr audioTrack, int32_t playerId=0) = 0; + virtual int unpublishAudio(agora_refptr audioTrack, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Publishes a local video track to the RTMP connection. @@ -259,7 +225,7 @@ class IRtmpLocalUser { * - 0: Success. * - < 0: Failure. 
*/ - virtual int publishVideo(agora_refptr videoTrack) = 0; + virtual int publishVideo(agora_refptr videoTrack, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Stops publishing the local video track to the RTMP connection. @@ -268,7 +234,7 @@ class IRtmpLocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int unpublishVideo(agora_refptr videoTrack) = 0; + virtual int unpublishVideo(agora_refptr videoTrack, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Registers a RTMP user observer object. @@ -281,7 +247,7 @@ class IRtmpLocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int registerRtmpUserObserver(IRtmpLocalUserObserver* observer, void(*safeDeleter)(IRtmpLocalUserObserver*) = NULL) = 0; + virtual int registerRtmpUserObserver(IRtmpLocalUserObserver* observer, void(*safeDeleter)(IRtmpLocalUserObserver*) = NULL, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the IRtmpLocalUserObserver object previously registered using registerRtmpUserObserver(). @@ -291,7 +257,7 @@ class IRtmpLocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int unregisteRtmpUserObserver(IRtmpLocalUserObserver* observer) = 0; + virtual int unregisterRtmpUserObserver(IRtmpLocalUserObserver* observer) = 0; /** * Registers an audio frame observer object. * @@ -301,7 +267,7 @@ class IRtmpLocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int registerAudioFrameObserver(media::IAudioPcmFrameSink* observer) = 0; + virtual int registerAudioFrameObserver(media::IAudioPcmFrameSink* observer, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Unregisters an audio frame observer object. @@ -319,7 +285,7 @@ class IRtmpLocalUser { * - 0: Success. * - < 0: Failure. */ - virtual int registerVideoFrameObserver(media::base::IVideoFrameObserver* observer) = 0; + virtual int registerVideoFrameObserver(media::base::IVideoFrameObserver* observer, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Unregisters a video frame observer object. 
diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraScreenCapturer.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraScreenCapturer.h index a5e5e1693..ee1633fb6 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraScreenCapturer.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraScreenCapturer.h @@ -8,6 +8,7 @@ #include "AgoraBase.h" #include "AgoraRefPtr.h" +#include namespace agora { namespace rtc { @@ -90,7 +91,7 @@ class IScreenCapturer : public RefCountInterface { * - < 0: Failure. * - ERR_NOT_READY: No screen or window is being shared. */ - virtual int setContentHint(VIDEO_CONTENT_HINT contentHint) = 0; + virtual int setContentHint(VIDEO_CONTENT_HINT contentHint, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Updates the screen capture region. @@ -102,19 +103,19 @@ class IScreenCapturer : public RefCountInterface { * - < 0: Failure. * - No screen or window is being shared. 
*/ - virtual int updateScreenCaptureRegion(const Rectangle& regionRect) = 0; + virtual int updateScreenCaptureRegion(const Rectangle& regionRect, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Set orientation of the captured screen image * @param VIDEO_ORIENTATION orientaion of the device 0(by default), 90, 180, 270 */ - virtual void setScreenOrientation(VIDEO_ORIENTATION orientation) = 0; + virtual int setScreenOrientation(VIDEO_ORIENTATION orientation, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Set frame rate of the screen capture source * @param rate frame rate (in fps) */ - virtual void setFrameRate(int rate) = 0; + virtual int setFrameRate(int rate, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; #if defined(__ANDROID__) /** @@ -141,7 +142,7 @@ class IScreenCapturer : public RefCountInterface { ~IScreenCapturer() {} }; -#if defined(__ANDROID__) || TARGET_OS_IPHONE +#if defined(__ANDROID__) || (defined(__APPLE__) && TARGET_OS_IPHONE) class IScreenCapturer2 : public RefCountInterface { public: /** @@ -154,7 +155,7 @@ class IScreenCapturer2 : public RefCountInterface { * - < 0: Failure. * - ERR_INVALID_ARGUMENT if data is null. */ - virtual int setScreenCaptureDimensions(const VideoDimensions& dimensions) = 0; + virtual int setScreenCaptureDimensions(const VideoDimensions& dimensions, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Updates the screen capture region. @@ -166,13 +167,13 @@ class IScreenCapturer2 : public RefCountInterface { * - < 0: Failure. * - No screen or window is being shared. 
*/ - virtual int updateScreenCaptureRegion(const Rectangle& regionRect) = 0; + virtual int updateScreenCaptureRegion(const Rectangle& regionRect, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Set frame rate of the screen capture source * @param rate frame rate (in fps) */ - virtual int setFrameRate(int rate) = 0; + virtual int setFrameRate(int rate, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Set channels and sample rate of screen audio capturing @@ -182,7 +183,7 @@ class IScreenCapturer2 : public RefCountInterface { * - 0: Sucess. * - < 0: Failure */ - virtual int setAudioRecordConfig(int channels, int sampleRate) = 0; + virtual int setAudioRecordConfig(int channels, int sampleRate, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Set volume of screen audio capturing @@ -191,7 +192,7 @@ class IScreenCapturer2 : public RefCountInterface { * - 0: Sucess. * - < 0: Failure */ - virtual int setAudioVolume(uint32_t volume) = 0; + virtual int setAudioVolume(uint32_t volume, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; protected: virtual ~IScreenCapturer2() {} diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraVideoFrame.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraVideoFrame.h index d2f6945c6..9b114002f 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraVideoFrame.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraVideoFrame.h @@ -186,6 +186,7 @@ OPTIONAL_ENUM_CLASS VideoFrameMetaDataType { kScreenMetaInfo, kVideoSourceType, kFaceInfo, + kFaceCaptureInfo, // Add other types afterwards }; @@ -211,7 +212,7 @@ class IVideoFrame : public RefCountInterface { virtual int getVideoFrameData(VideoFrameData& data) const = 0; /** - * Fill the underlying buffer with source buffer info contained in VideoFrameInfo + * Fill the underlying buffer with source buffer info contained in VideoFrameData * For frames of type "Type::kMemPixels", This function first tries 
to fill in-place with no copy and reallocation. * When it fails, a copy or copy-plus-reallocation may happen * @param data [in] Data to be filled in. diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraVideoMixerSource.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraVideoMixerSource.h index f179cdac4..7cbe7183c 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraVideoMixerSource.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraVideoMixerSource.h @@ -51,7 +51,7 @@ class IVideoMixerSource : public RefCountInterface { * 0 - Success * <0 - Failure */ - virtual int addVideoTrack(const char* id, agora_refptr track) = 0; + virtual int addVideoTrack(const char* id, agora_refptr track, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Remove the video track. * @param id The unique id of the stream. @@ -60,7 +60,7 @@ class IVideoMixerSource : public RefCountInterface { * 0 - Success * <0 - Failure */ - virtual int removeVideoTrack(const char* id, agora_refptr track) = 0; + virtual int removeVideoTrack(const char* id, agora_refptr track, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Configures the layout of video frames comming from a specific track (indicated by uid) * on the mixer canvas. @@ -70,7 +70,7 @@ class IVideoMixerSource : public RefCountInterface { * 0 - Success * <0 - Failure */ - virtual int setStreamLayout(const char* id, const MixerLayoutConfig& config) = 0; + virtual int setStreamLayout(const char* id, const MixerLayoutConfig& config, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Remove the user layout on the mixer canvas * @param id The unique id of the stream. 
@@ -79,7 +79,7 @@ class IVideoMixerSource : public RefCountInterface { * 0 - Success * <0 - Failure */ - virtual int delStreamLayout(const char* id) = 0; + virtual int delStreamLayout(const char* id, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Add a image source to the mixer with its layout configuration on the mixer canvas. * @param id The unique id of the image. @@ -88,7 +88,7 @@ class IVideoMixerSource : public RefCountInterface { * 0 - Success * <0 - Failure */ - virtual int addImageSource(const char* id, const MixerLayoutConfig& config, ImageType type = kPng) = 0; + virtual int addImageSource(const char* id, const MixerLayoutConfig& config, ImageType type = kPng, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Delete a image source to the mixer. * @param id The unique id of the image. @@ -96,18 +96,18 @@ class IVideoMixerSource : public RefCountInterface { * 0 - Success * <0 - Failure */ - virtual int delImageSource(const char* id) = 0; + virtual int delImageSource(const char* id, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Clear all the layout settings set previously */ - virtual void clearLayout() = 0; + virtual int clearLayout(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Refresh the user layout on the mixer canvas * @return * 0 - Success * <0 - Failure */ - virtual int refresh() = 0; + virtual int refresh(ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Set the mixer canvas background to override the default configuration * @param width width of the canvas @@ -118,7 +118,7 @@ class IVideoMixerSource : public RefCountInterface { * 0 - Success * <0 - Failure */ - virtual int setBackground(uint32_t width, uint32_t height, int fps, uint32_t color_argb = 0) = 0; + virtual int setBackground(uint32_t width, uint32_t height, int fps, uint32_t color_argb = 0, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Set the mixer canvas background to override the default configuration * @param width width of the canvas @@ -129,7 +129,7 @@ class IVideoMixerSource : public 
RefCountInterface { * 0 - Success * <0 - Failure */ - virtual int setBackground(uint32_t width, uint32_t height, int fps, const char* url) = 0; + virtual int setBackground(uint32_t width, uint32_t height, int fps, const char* url, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Set the rotation of the mixed video stream * @param rotation:0:none, 1:90°, 2:180°, 3:270° @@ -137,7 +137,7 @@ class IVideoMixerSource : public RefCountInterface { * 0 - Success * <0 - Failure */ - virtual int setRotation(uint8_t rotation) = 0; + virtual int setRotation(uint8_t rotation, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Get the average delay in ms introduced by the mixer module, which includes the average * mixing delay plus the encoder delay. @@ -152,7 +152,7 @@ class IVideoMixerSource : public RefCountInterface { * 0 - Success * <0 - Failure */ - virtual int setMasterClockSource(const char* id = NULL) = 0; + virtual int setMasterClockSource(const char* id = NULL, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; }; } //namespace rtc diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraVideoTrack.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraVideoTrack.h index cb927ae9f..a79cb5e45 100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraVideoTrack.h +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/NGIAgoraVideoTrack.h @@ -9,6 +9,7 @@ #pragma once // NOLINT(build/header_guard) #include "AgoraBase.h" +#include #ifndef OPTIONAL_OVERRIDE #if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) @@ -55,7 +56,7 @@ class IVideoTrack : public RefCountInterface { */ virtual bool addVideoFilter( agora_refptr filter, media::base::VIDEO_MODULE_POSITION position = media::base::POSITION_POST_CAPTURER, - const char* id = NULL) = 0; + const char* id = NULL, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Removes the video filter added by `addVideoFilter` from the video track. 
@@ -69,7 +70,7 @@ class IVideoTrack : public RefCountInterface { */ virtual bool removeVideoFilter( agora_refptr filter, media::base::VIDEO_MODULE_POSITION position = media::base::POSITION_POST_CAPTURER, - const char* id = NULL) = 0; + const char* id = NULL, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Whether a video filter exists @@ -94,7 +95,7 @@ class IVideoTrack : public RefCountInterface { * - `true`: The video renderer is added successfully. * - `false`: The video renderer fails to be added. */ - virtual bool addRenderer(agora_refptr videoRenderer, media::base::VIDEO_MODULE_POSITION position = media::base::POSITION_PRE_RENDERER) = 0; + virtual bool addRenderer(agora_refptr videoRenderer, media::base::VIDEO_MODULE_POSITION position, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Removes the video renderer added by `addRenderer` from the video track. * @@ -104,7 +105,7 @@ class IVideoTrack : public RefCountInterface { * - `true`: The video renderer is removed successfully. * - `false`: The video renderer fails to be removed. 
*/ - virtual bool removeRenderer(agora_refptr videoRenderer, media::base::VIDEO_MODULE_POSITION position = media::base::POSITION_PRE_RENDERER) = 0; + virtual bool removeRenderer(agora_refptr videoRenderer, media::base::VIDEO_MODULE_POSITION position, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Get the track type of the video track * @return @@ -120,7 +121,7 @@ class IVideoTrack : public RefCountInterface { * - 0: success * - <0: failure */ - virtual int enableVideoFilter(const char* id, bool enable) { return -1; } + virtual int enableVideoFilter(const char* id, bool enable, ahpl_ref_t ares = AHPL_REF_INVALID) { return -1; } /** * set the properties of the specified video filter @@ -131,7 +132,7 @@ class IVideoTrack : public RefCountInterface { * - 0: success * - <0: failure */ - virtual int setFilterProperty(const char* id, const char* key, const char* json_value) { return -1; } + virtual int setFilterProperty(const char* id, const char* key, const char* json_value, ahpl_ref_t ares = AHPL_REF_INVALID) { return -1; } /** * get the properties of the specified video filter @@ -142,7 +143,7 @@ class IVideoTrack : public RefCountInterface { * - 0: success * - <0: failure */ - virtual int getFilterProperty(const char* id, const char* key, char* json_value, size_t buf_size) { return -1; } + virtual int getFilterProperty(const char* id, const char* key, char* json_value, size_t buf_size, ahpl_ref_t ares = AHPL_REF_INVALID) { return -1; } protected: ~IVideoTrack() {} @@ -234,6 +235,10 @@ struct LocalVideoTrackStats { int height; uint32_t encoder_type; uint32_t hw_encoder_accelerating; + /* + * encoder vender id, VideoCodecVenderId + */ + uint32_t encoder_vender_id; /** * The average time diff between frame captured and framed encoded. 
*/ @@ -272,6 +277,8 @@ struct LocalVideoTrackStats { width(0), height(0), encoder_type(0), + hw_encoder_accelerating(0), + encoder_vender_id(0), uplink_cost_time_ms(0), quality_adapt_indication(ADAPT_NONE), txPacketLossRate(0), @@ -300,7 +307,7 @@ class ILocalVideoTrack : public IVideoTrack { * - `true`: Enable the local video track. * - `false`: Disable the local video track. */ - virtual void setEnabled(bool enable) = 0; + virtual int setEnabled(bool enable, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Sets the video encoder configuration. @@ -318,7 +325,7 @@ class ILocalVideoTrack : public IVideoTrack { * - 0: Success. * - < 0: Failure. */ - virtual int setVideoEncoderConfiguration(const VideoEncoderConfiguration& config) = 0; + virtual int setVideoEncoderConfiguration(const VideoEncoderConfiguration& config, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Set simulcast stream mode, enable, disable or auto enable @@ -329,7 +336,7 @@ class ILocalVideoTrack : public IVideoTrack { * - 0: Success. * - < 0: Failure. */ - virtual int setSimulcastStreamMode(SIMULCAST_STREAM_MODE mode, const SimulcastStreamConfig& config) = 0; + virtual int setSimulcastStreamMode(SIMULCAST_STREAM_MODE mode, const SimulcastStreamConfig& config, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Gets the state of the local video stream. 
@@ -446,13 +453,17 @@ struct RemoteVideoTrackStats { vqa avg cost ms */ int vqa_avg_cost_ms; + /** + decoder vender id, VideoCodecVenderId + */ + uint32_t decoder_vender_id; RemoteVideoTrackStats() : uid(0), delay(0), width(0), height(0), receivedBitrate(0), decoderOutputFrameRate(0), rendererOutputFrameRate(0), frameLossRate(0), packetLossRate(0), rxStreamType(VIDEO_STREAM_HIGH), totalFrozenTime(0), frozenRate(0), received_bytes(0), totalDecodedFrames(0), avSyncTimeMs(0), downlink_process_time_ms(0), frame_render_delay_ms(0), totalActiveTime(0), - publishDuration(0), vqa_mos(0), vqa_avg_cost_ms(0) {} + publishDuration(0), vqa_mos(0), vqa_avg_cost_ms(0), decoder_vender_id(0) {} }; /** @@ -493,7 +504,7 @@ class IRemoteVideoTrack : public IVideoTrack { * - 0: Success. * - < 0: Failure. */ - virtual int registerVideoEncodedFrameObserver(agora::media::IVideoEncodedFrameObserver* encodedObserver) = 0; + virtual int registerVideoEncodedFrameObserver(agora::media::IVideoEncodedFrameObserver* encodedObserver, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the \ref agora::media::IVideoEncodedFrameObserver "IVideoEncodedFrameObserver" object. * @param encodedObserver The pointer to the `IVideoEncodedFrameObserver` object. @@ -515,7 +526,7 @@ class IRemoteVideoTrack : public IVideoTrack { * - 0: Success. * - < 0: Failure. */ - virtual int registerMediaPacketReceiver(IMediaPacketReceiver* videoReceiver) = 0; + virtual int registerMediaPacketReceiver(IMediaPacketReceiver* videoReceiver, ahpl_ref_t ares = AHPL_REF_INVALID) = 0; /** * Releases the \ref agora::rtc::IMediaPacketReceiver "IMediaPacketReceiver" object. * @param videoReceiver The pointer to the `IMediaPacketReceiver` object. 
diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_ares.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_ares.h new file mode 100644 index 000000000..92c409bb3 --- /dev/null +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_ares.h @@ -0,0 +1,77 @@ +/************************************************************* + * Author: Lionfore Hao (haolianfu@agora.io) + * Date : May 30th, 2023 + * Module: AHPL async result object header file + * + * + * This is a part of the Advanced High Performance Library. + * Copyright (C) 2018 ~ 2023 Agora IO + * All rights reserved. + * + *************************************************************/ + +#ifndef __AHPL_ARES_H__ +#define __AHPL_ARES_H__ + +#include +#include +#include +#include + +#ifdef __cplusplus +extern "C" { +#endif + + +/** + * Create an async result object. + * Parameter: + * arg: the parameter attached to the async result object; + * Return value: + * the async result object ref id just created, AHPL_REF_INVALID when failed. + **/ +extern __ahpl_api__ ahpl_ref_t ahpl_ares_create (void *arg); + +/** + * Complete the specified async result object. + * Parameters: + * ref: the async result object ref id; + * result: a result value which can be retrieved by wait function; + * Return value: + * <0: error occurred, and ahpl_errno indicates which error; + * >=0: successful; + **/ +extern __ahpl_api__ int ahpl_ares_complete (ahpl_ref_t ref, intptr_t result); + +/** + * Wait for the specified async result object to complete. 
+ * Parameters: + * ref: the async result object ref id; + * timeo: maximum waiting time in milliseconds; + * result: variable address for the value which was set by complete function, + * NOTE: the *result will only be set when the return value of wait + * function is AHPL_POLL_ST_SIGNALED and result != NULL; if you + * do not care about the complete result, just pass NULL to it; + * Return value: + * <0: error occurred, and ahpl_errno indicates which error; + * >=0: AHPL_POLL_ST_* macros value; + **/ +extern __ahpl_api__ int ahpl_ares_wait (ahpl_ref_t ref, intptr_t timeo, intptr_t *result); + +/** + * Reset the specified async result object to the non-signaled state. + * Parameters: + * ref: the async result object ref id + * Return value: + * <0: error occurred, and ahpl_errno indicates which error; + * >=0: successful; + **/ +extern __ahpl_api__ int ahpl_ares_reset (ahpl_ref_t ref); + + + +#ifdef __cplusplus +} +#endif + +#endif /* __AHPL_ARES_H__ */ \ No newline at end of file diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_defs.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_defs.h new file mode 100644 index 000000000..3f1cb6a02 --- /dev/null +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_defs.h @@ -0,0 +1,171 @@ +/************************************************************* + * Author: Lionfore Hao (haolianfu@agora.io) + * Date : Jul 21st, 2018 + * Module: AHPL common definitions header file + * + * + * This is a part of the Advanced High Performance Library. + * Copyright (C) 2018 Agora IO + * All rights reserved. 
+ * + *************************************************************/ + +#ifndef __AHPL_DEFS_H__ +#define __AHPL_DEFS_H__ + + +#define ahpl_stringify_1(x) #x +#define ahpl_stringify(x) ahpl_stringify_1(x) + + +#ifndef __MAKERCORE_ASSEMBLY__ + +#ifdef __cplusplus +extern "C" { +#endif + + +#ifdef _MSC_VER +#ifndef __inline__ +#define __inline__ __inline +#endif +#endif + + +#ifndef container_of +#if defined (__GNUC__) +#define container_of(ptr, type, member) ({ \ + const typeof( ((type *)0)->member ) *__mptr = (void *)(ptr); \ + (type *)( (char *)__mptr - offsetof(type,member) );}) +#else +#define container_of(ptr, type, member) ((type *)((char *)(ptr) - offsetof(type,member))) +#endif +#endif + +#define ahpl_rela_addr(type, field) (&((type *)0)->field) +#define ahpl_base_addr(ptr, type, field) \ + ((type *)((uintptr_t)(ptr) - (uintptr_t)(&((type *)0)->field))) + + +#define ahpl_min(x, y) ((x) < (y) ? (x) : (y)) +#define ahpl_max(x, y) ((x) > (y) ? (x) : (y)) + + +/* I think 64 args is big enough */ +#define AHPL_VAR_ARGS_MAX 64 + + +#ifdef BUILD_TARGET_SHARED +#if defined (__GNUC__) +#define __export_in_so__ __attribute__ ((visibility ("default"))) +#elif defined (_MSC_VER) +#define __export_in_so__ __declspec (dllexport) +#endif +#else +#define __export_in_so__ +#endif + + +#ifndef __ahpl_api__ +#if defined (_MSC_VER) && defined (BUILDING_API_IMPL_SOURCE) && defined (BUILD_TARGET_SHARED) +#define __ahpl_api__ __declspec (dllexport) +#elif defined (_MSC_VER) && !defined (BUILDING_API_IMPL_SOURCE) +#define __ahpl_api__ __declspec (dllimport) +#else +#define __ahpl_api__ +#endif +#endif + +#if defined (BUILDING_API_IMPL_SOURCE) || defined (STATIC_LINKING_AHPL) + +#if defined (__GNUC__) +#define __so_api__ __attribute__ ((visibility ("default"))) +#elif defined (_MSC_VER) +#define __so_api__ __declspec (dllexport) +#else +#define __so_api__ +#endif + +#else + +#if defined (_MSC_VER) +#define __so_api__ __declspec (dllimport) +#else +#define __so_api__ +#endif + +#endif 
+ + +#ifdef __GNUC__ +#ifndef __MACH__ +#define AHPL_DEFINE_BIN(v, f) \ +__asm__ (".section .rodata\n\t" \ + ".globl "#v"_bin_begin\n\t" \ + ".hidden "#v"_bin_begin\n\t" \ + ".align 4\n\t" \ + #v"_bin_begin:\n\t" \ + ".incbin \"" ahpl_stringify(MAKERCORE_THIS_FILE_DIR/f)"\"\n\t" \ + ".globl "#v"_bin_end\n\t" \ + ".hidden "#v"_bin_end\n\t" \ + #v"_bin_end:\n\t" \ + ".previous\n\t") +#else +#define AHPL_DEFINE_BIN(v, f) \ +__asm__ (".section __TEXT,__const\n\t" \ + ".globl _"#v"_bin_begin\n\t" \ + ".private_extern _"#v"_bin_begin\n\t" \ + ".align 4\n\t" \ + "_"#v"_bin_begin:\n\t" \ + ".incbin \"" ahpl_stringify (MAKERCORE_THIS_FILE_DIR/f) "\"\n\t" \ + ".globl _"#v"_bin_end\n\t" \ + ".private_extern _"#v"_bin_end\n\t" \ + "_"#v"_bin_end:\n\t" \ + ".previous\n\t") +#endif + +#define AHPL_DECLARE_BIN(v) extern unsigned char v##_bin_begin, v##_bin_end +#define AHPL_BIN_ADDR(v) ((void *)&v##_bin_begin) +#define AHPL_BIN_SIZE(v) ((size_t)((unsigned char *)&v##_bin_end - (unsigned char *)&v##_bin_begin)) +#endif + + +#ifdef __cplusplus +} +#endif + +#else /* __MAKERCORE_ASSEMBLY__ */ + +#ifdef __GNUC__ +#ifndef __MACH__ +.macro AHPL_DEFINE_BIN_S v, f + .section .rodata + .globl \v\()_bin_begin + .hidden \v\()_bin_begin + .align 4 + \v\()_bin_begin: + .incbin ahpl_stringify (MAKERCORE_THIS_FILE_DIR/\f) + .globl \v\()_bin_end + .hidden \v\()_bin_end + \v\()_bin_end: + .previous +.endm +#else +.macro AHPL_DEFINE_BIN_S v, f + .section __TEXT,__const + .globl _\()\v\()_bin_begin + .private_extern _\()\v\()_bin_begin + .align 4 + _\()\v\()_bin_begin: + .incbin ahpl_stringify (MAKERCORE_THIS_FILE_DIR/\f) + .globl _\()\v\()_bin_end + .private_extern _\()\v\()_bin_end + _\()\v\()_bin_end: + .previous +.endm +#endif +#endif + +#endif /* __MAKERCORE_ASSEMBLY__ */ + +#endif /* __AHPL_DEFS_H__ */ \ No newline at end of file diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_poll.h 
b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_poll.h new file mode 100644 index 000000000..d8d1e7785 --- /dev/null +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_poll.h @@ -0,0 +1,51 @@ +/************************************************************* + * Author: Lionfore Hao (haolianfu@agora.io) + * Date : May 30th, 2023 + * Module: AHPL poll functionality definition header file + * + * + * This is a part of the Advanced High Performance Library. + * Copyright (C) 2018 ~ 2023 Agora IO + * All rights reserved. + * + *************************************************************/ + +#ifndef __AHPL_POLL_H__ +#define __AHPL_POLL_H__ + +#include +#include +#include + +#ifdef __cplusplus +extern "C" { +#endif + + + +#define AHPL_POLL_ST_NONE 0 +#define AHPL_POLL_ST_SIGNALED 1 +#define AHPL_POLL_ST_DESTROY 2 + +/** + * Poll the objects specified in refs, return their states. + * Parameters: + * refs: the object refs array for input, and the signaled refs array for output; + * count: the number of items in the input refs array; + * min: the minimum number of signaled refs for triggering a wake-up. + * NOTE: if any one of the refs encounters an error or is destroyed, + * then wake up immediately regardless of the min parameter.
+ * timeo: maximum waiting time in milliseconds; + * Return value: + * <0: error occurred, and ahpl_errno indicates which error; + * >=0: the signaled refs count before timeout; + **/ +extern __ahpl_api__ ssize_t ahpl_poll (ahpl_ref_t refs [], size_t count, size_t min, intptr_t timeo); + + + +#ifdef __cplusplus +} +#endif + +#endif /* __AHPL_POLL_H__ */ \ No newline at end of file diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_ref.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_ref.h new file mode 100644 index 000000000..b5264468b --- /dev/null +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_ref.h @@ -0,0 +1,230 @@ +/************************************************************* + * Author: Lionfore Hao (haolianfu@agora.io) + * Date : Nov 19th, 2018 + * Module: AHPL reference object definition file + * + * + * This is a part of the Advanced High Performance Library. + * Copyright (C) 2018 Agora IO + * All rights reserved. + * + *************************************************************/ + +#ifndef __AHPL_REF_H__ +#define __AHPL_REF_H__ + + +#include +#include + + +#ifdef __cplusplus +extern "C" { +#endif + + + +typedef struct _internal_ref_od_ *ahpl_ref_t; + +#define AHPL_REF_INVALID ((ahpl_ref_t)(intptr_t)-1) + +#define ahpl_ref_invalid(ref) (((int)(intptr_t)(ref)) < 0) + + +/** + * The reference object destructor function prototype, which is invoked when the + * application calls the ahpl_ref_destroy function to release resources. + * Parameter: + * arg: the parameter passed in when creating the reference object; + * Return value: + * none. + **/ +typedef void (*ahpl_ref_dtor_t) (void *arg); + +/** + * The reference object creating function prototype, which is used to create a ref object.
+ * Parameters: + * arg: the parameter attached to the reference object; + * dtor: the ref object destructor function, which will be invoked when + * the ref object is deleted; + * caller_free: + * non-0 guarantee that the ref object's related resources are freed in the caller thread + * 0 the ref object's related resources may be freed in any thread + * Return value: + * the ref object id; use the ahpl_ref_invalid macro to check whether creation failed. + **/ +extern __ahpl_api__ ahpl_ref_t ahpl_ref_create (void *arg, ahpl_ref_dtor_t dtor, int caller_free); + + +/** + * The ref object callback function prototype. + * Parameter: + * arg: the ref object argument which was passed in when creating; + * argc: specify the argv array elements count, the same as the argc + * when invoking ahpl_ref_[get|read|write] functions; + * argv: array for passing variable args, the same as the args + * when invoking ahpl_task_exec_* functions; + * Return value: + * none. + **/ +typedef void (*ahpl_ref_func_t) (void *arg, uintptr_t argc, uintptr_t argv []); + +/** + * Hold the ref object, and invoke the specified callback function. + * Parameter: + * ref: the ref object id; + * f: the callback function; + * argc: the args count + * ...: variable args + * Return value: + * 0: success + * <0: failure with ahpl_errno set + **/ +extern __ahpl_api__ int ahpl_ref_hold (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, ...); +extern __ahpl_api__ int ahpl_ref_hold_args (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, va_list args); +extern __ahpl_api__ int ahpl_ref_hold_argv (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []); + +/** + * Hold the ref object and read lock it, then invoke the specified callback function.
+ * Parameter: + * ref: the ref object id; + * f: the callback function; + * argc: the args count + * ...: variable args + * Return value: + * 0: success + * <0: failure with ahpl_errno set + **/ +extern __ahpl_api__ int ahpl_ref_read (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, ...); +extern __ahpl_api__ int ahpl_ref_read_args (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, va_list args); +extern __ahpl_api__ int ahpl_ref_read_argv (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []); + +/** + * Hold the ref object and write lock it, then invoke the specified callback function. + * Parameter: + * ref: the ref object id; + * f: the callback function; + * argc: the args count + * ...: variable args + * Return value: + * 0: success + * <0: failure with ahpl_errno set + **/ +extern __ahpl_api__ int ahpl_ref_write (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, ...); +extern __ahpl_api__ int ahpl_ref_write_args (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, va_list args); +extern __ahpl_api__ int ahpl_ref_write_argv (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []); + +/** + * Hold the ref object and set it maystall, then invoke the specified callback function. + * Parameter: + * ref: the ref object id; + * f: the callback function; + * argc: the args count + * ...: variable args + * Return value: + * 0: success + * <0: failure with ahpl_errno set + **/ +extern __ahpl_api__ int ahpl_ref_maystall (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, ...); +extern __ahpl_api__ int ahpl_ref_maystall_args (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, va_list args); +extern __ahpl_api__ int ahpl_ref_maystall_argv (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []); + + +typedef void *ahpl_refobj_t; + +#define AHPL_FREE_ONLY_OBJ ((ahpl_refobj_t)(uintptr_t)1) +#define ahpl_is_free_only(robj) ((int)((ahpl_refobj_t)(robj) == AHPL_FREE_ONLY_OBJ)) + +/** + * Retrieve the ref object arg. 
+ * Parameters: + * robj: the reference object; + * Return value: + * the ref object arg; + **/ +extern __ahpl_api__ void *ahpl_refobj_arg (ahpl_refobj_t robj); + +/** + * Get the ref id of the specified ref object. + * Parameters: + * robj: the reference object; + * Return value: + * the ref id. + **/ +extern __ahpl_api__ ahpl_ref_t ahpl_refobj_id (ahpl_refobj_t robj); + +/** + * Make sure to read-lock the ref object specified by robj, then invoke the specified callback function. + * Parameter: + * robj: the ref object itself; + * f: the callback function; + * argc: the args count + * ...: variable args + * Return value: + * 0: success + * <0: failure with ahpl_errno set + **/ +extern __ahpl_api__ int ahpl_refobj_read (ahpl_refobj_t robj, ahpl_ref_func_t f, uintptr_t argc, ...); +extern __ahpl_api__ int ahpl_refobj_read_args (ahpl_refobj_t robj, ahpl_ref_func_t f, uintptr_t argc, va_list args); +extern __ahpl_api__ int ahpl_refobj_read_argv (ahpl_refobj_t robj, ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []); + +/** + * Make sure to set the ref object specified by robj to maystall, then invoke the specified callback function. + * Parameter: + * robj: the ref object itself; + * f: the callback function; + * argc: the args count + * ...: variable args + * Return value: + * 0: success + * <0: failure with ahpl_errno set + **/ +extern __ahpl_api__ int ahpl_refobj_maystall (ahpl_refobj_t robj, ahpl_ref_func_t f, uintptr_t argc, ...); +extern __ahpl_api__ int ahpl_refobj_maystall_args (ahpl_refobj_t robj, ahpl_ref_func_t f, uintptr_t argc, va_list args); +extern __ahpl_api__ int ahpl_refobj_maystall_argv (ahpl_refobj_t robj, ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []); + + +/** + * Detect whether the reference object specified by ref is read locked + * by the calling thread.
+ * Parameter: + * ref: the reference object id + * Return value: + * 0: not read locked + * non-zero: read locked by the calling thread + **/ +extern __ahpl_api__ int ahpl_ref_locked (ahpl_ref_t ref); + +/** + * Set the living scope ref object of the specified ref object. + * Parameters: + * ref: the ref object ref id; + * scope_ref: the living scope ref, the ref object will be destroyed + * when the object specified by scope_ref is destroyed; + * Return value: + * <0: error occurred, and ahpl_errno indicates which error; + * >=0: successful; + **/ +extern __ahpl_api__ int ahpl_ref_set_scope (ahpl_ref_t ref, ahpl_ref_t scope_ref); + +/** + * Destroy the reference object specified by ref. + * Parameter: + * ref: the reference object id + * do_delete: 0 for just marking it destroyed + * non-0 value for deleting it + * Return value: + * 0: success + * <0: failed, and ahpl_errno indicates what error occurs + **/ +extern __ahpl_api__ int ahpl_ref_destroy (ahpl_ref_t ref, int do_delete); + + + +#ifdef __cplusplus +} +#endif + + + +#endif /* __AHPL_REF_H__ */ \ No newline at end of file diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_types.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_types.h new file mode 100644 index 000000000..0617d3326 --- /dev/null +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/ahpl_types.h @@ -0,0 +1,96 @@ +/************************************************************* + * Author: Lionfore Hao (haolianfu@agora.io) + * Date : Jul 27th, 2020 + * Module: AHPL POSIX definitions header file + * + * + * This is a part of the Advanced High Performance Library. + * Copyright (C) 2020 Agora IO + * All rights reserved.
+ * + *************************************************************/ + +#ifndef __AHPL_TYPES_H__ +#define __AHPL_TYPES_H__ + +#include +#include +#include +#include +#if defined (__linux__) || defined (__MACH__) || defined (__kliteos2__) || defined (_WIN32) +#include +#endif + +#include + +#ifdef __cplusplus +extern "C" { +#endif + + +#if !defined (__linux__) && !defined (__MACH__) +/** + * Some platform might define a macro for this type, + * so confirm that it is not a macro. + * -- Lionfore Hao Nov 5th, 2018 + **/ +#ifndef __ssize_t_defined +typedef intptr_t ssize_t; +#define __ssize_t_defined +#endif +#endif + + +/* The AHPL timestamp type */ +typedef unsigned long long ahpl_ts_t; + +/* The proto for a general ahpl var args function with argc & argv. */ +typedef void (*ahpl_argv_f) (uintptr_t argc, uintptr_t argv []); + +/* The proto for a general ahpl object destructor function. */ +typedef ahpl_argv_f ahpl_obj_dtor_t; + + +#if !defined (_WIN32) && !defined (__kspreadtrum__) +typedef int ahpl_fd_t; +#define AHPL_INVALID_FD ((ahpl_fd_t)-1) + +static __inline__ int ahpl_fd_invalid (ahpl_fd_t fd) +{ + return (int)(fd < 0); +} +#else +#if defined (_WIN32) +/** + * We MUST include 'winsock2.h' before any occurrence + * of including 'windows.h'; otherwise Windows complains + * that many definitions are redefined.
+ * -- Lionfore Hao Sep 25th, 2018 + **/ +#include +#include + +typedef HANDLE ahpl_fd_t; +#define AHPL_INVALID_FD ((ahpl_fd_t)INVALID_HANDLE_VALUE) +#elif defined (__kspreadtrum__) +#include +#include + +typedef TCPIP_SOCKET_T ahpl_fd_t; +#define AHPL_INVALID_FD ((ahpl_fd_t)-1) +#endif + +static __inline__ int ahpl_fd_invalid (ahpl_fd_t fd) +{ + return (int)(fd == AHPL_INVALID_FD); +} +#endif + + +#ifdef __cplusplus +} +#endif + + +#endif /* __AHPL_TYPES_H__ */ \ No newline at end of file diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/cpp/ahpl_ares_class.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/cpp/ahpl_ares_class.h new file mode 100644 index 000000000..8643c0b78 --- /dev/null +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/cpp/ahpl_ares_class.h @@ -0,0 +1,91 @@ +/************************************************************* + * Author: Lionfore Hao (haolianfu@agora.io) + * Date : Jun 23rd, 2023 + * Module: AHPL async result object for C++ definition file + * + * + * This is a part of the Advanced High Performance Library. + * Copyright (C) 2018 ~ 2023 Agora IO + * All rights reserved. + * + *************************************************************/ + +#ifndef __AHPL_ARES_CLASS_H__ +#define __AHPL_ARES_CLASS_H__ + + +#include + +#include +#include +#include +#include +#include + + +class ahpl_ares_class: public ahpl_ref_class { +public: + ahpl_ares_class (): ahpl_ref_class (ahpl_ares_create (this)) + { + if (ahpl_ref_invalid (ref ())) + abort (); + } + + /** + * The destructor of this class is very different from the + * base class and other derivatives: it destroys the ref + * in the destructor, and the destructor is public.
+ **/ + virtual ~ahpl_ares_class () + { + ahpl_ref_t refid = ref (); + if (!ahpl_ref_invalid (refid)) + ahpl_ref_destroy (refid, true); + } + + /* complete the async result */ + int complete (intptr_t result = 0) + { + return ahpl_ares_complete (ref (), result); + } + + /* wait for the async result to be completed */ + int wait (intptr_t timeo, intptr_t *result = NULL) + { + return ahpl_ares_wait (ref (), timeo, result); + } + + /* reset the signaled state */ + int reset (void) + { + return ahpl_ares_reset (ref ()); + } + + operator ahpl_ref_t () const + { + return ref (); + } + +private: + /* we do not allow invoking the destroy function of the base class */ + int destroy (bool do_delete = true) + { + abort (); + return 0; + } + +#if (__cplusplus >= 201103) || defined (_MSC_VER) +private: + ahpl_ares_class (const ahpl_ares_class &) = delete; + ahpl_ares_class (ahpl_ares_class &&) = delete; + ahpl_ares_class &operator = (const ahpl_ares_class &) = delete; + ahpl_ares_class &operator = (ahpl_ares_class &&) = delete; +#else +private: + ahpl_ares_class (const ahpl_ares_class &); + ahpl_ares_class &operator = (const ahpl_ares_class &); +#endif /* C++11 */ +}; + + +#endif /* __AHPL_ARES_CLASS_H__ */ \ No newline at end of file diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/cpp/ahpl_poll_class.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/cpp/ahpl_poll_class.h new file mode 100644 index 000000000..2253b5d34 --- /dev/null +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/cpp/ahpl_poll_class.h @@ -0,0 +1,129 @@ +/************************************************************* + * Author: Lionfore Hao (haolianfu@agora.io) + * Date : Jun 23rd, 2023 + * Module: AHPL poll functionality for C++ definition file + * + * + * This is a part of the Advanced High Performance Library. + * Copyright (C) 2018 ~ 2023 Agora IO + * All rights reserved.
+ * + *************************************************************/ + +#ifndef __AHPL_POLL_CLASS_H__ +#define __AHPL_POLL_CLASS_H__ + + +#include +#include +#include +#include +#include + +#if (__cplusplus >= 201103) || defined (_MSC_VER) +#include +#include +#endif + +#include +#include + + +class ahpl_poll_class { +private: + std::map poll_refs; + std::vector signaled_refs; + +public: + int add (const ahpl_ares_class &tail) + { + poll_refs [tail.ref ()] = &tail; + return 0; + } + +#if (__cplusplus >= 201103) || defined (_MSC_VER) + template + int add (const T &head, const Targs&... rest) + { + poll_refs [head.ref ()] = &head; + return add (rest...); + } + + /* constructor with variable args */ + template + ahpl_poll_class (Targs&... args) + { + add (args...); + } +#endif /* C++11 */ + + /* poll the constructed async results */ + int poll (size_t min, intptr_t timeo) + { + ahpl_ref_t local_refs [32]; + ahpl_ref_t *refs = local_refs; + size_t count = poll_refs.size (); + std::map::iterator it; + int i; + int err; + + if (count > sizeof local_refs / sizeof local_refs [0]) { + refs = new ahpl_ref_t [count]; + if (refs == NULL) + return -1; + } + + i = 0; + for (it = poll_refs.begin (); it != poll_refs.end (); it++) + refs [i++] = it->first; + + err = ahpl_poll (refs, count, min, timeo); + signaled_refs.clear (); + for (i = 0; i < err; i++) { + it = poll_refs.find (refs [i]); + if (it != poll_refs.end ()) + signaled_refs.push_back (it->second); + } + + if (refs != local_refs) + delete [] refs; + + return err; + } + + /* total async results count */ + size_t total () + { + return poll_refs.size (); + } + + /* signaled async results count */ + size_t signaled () + { + return signaled_refs.size (); + } + + /* operator for accessing the signaled async results */ + const ahpl_ares_class *operator [] (size_t idx) + { + if (idx < signaled_refs.size ()) + return signaled_refs [idx]; + + return NULL; + } + +#if (__cplusplus >= 201103) || defined (_MSC_VER) +private: + 
ahpl_poll_class (const ahpl_poll_class &) = delete; + ahpl_poll_class (ahpl_poll_class &&) = delete; + ahpl_poll_class &operator = (const ahpl_poll_class &) = delete; + ahpl_poll_class &operator = (ahpl_poll_class &&) = delete; +#else +private: + ahpl_poll_class (const ahpl_poll_class &); + ahpl_poll_class &operator = (const ahpl_poll_class &); +#endif +}; + + +#endif /* __AHPL_POLL_CLASS_H__ */ \ No newline at end of file diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/cpp/ahpl_ref_class.h b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/cpp/ahpl_ref_class.h new file mode 100644 index 000000000..85958718e --- /dev/null +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/AgoraRtcKit/api/cpp/ahpl_ref_class.h @@ -0,0 +1,1002 @@ +/************************************************************* + * Author: Lionfore Hao (haolianfu@agora.io) + * Date : Nov 19th, 2018 + * Module: AHPL reference object for C++ definition file + * + * + * This is a part of the Advanced High Performance Library. + * Copyright (C) 2018 Agora IO + * All rights reserved. + * + *************************************************************/ + +#ifndef __AHPL_REF_OBJ_CPP_H__ +#define __AHPL_REF_OBJ_CPP_H__ + + +#include + +#include +#include +#include + +#ifdef COMPILING_WITH_MPQ_H +#include +#ifdef COMPILING_WITH_MPQP_H +#include +#endif +#endif + +#ifdef COMPILING_WITH_ASYNC_H +#include +#endif + +#if (__cplusplus >= 201103) || defined (_MSC_VER) +#include +#include +#endif + +class ahpl_ref_class { +private: + ahpl_ref_t ref_id; + +public: + ahpl_ref_class (bool caller_free = true) + { + ref_id = ahpl_ref_create (this, __dtor, (int)caller_free); + if (ahpl_ref_invalid (ref_id)) + abort (); + } + + ahpl_ref_class (ahpl_ref_t ref) + { + ref_id = ref; + } + + ahpl_ref_t ref () const + { + return ref_id; + } + + int hold (ahpl_ref_func_t f, uintptr_t argc, ...) 
+ { + va_list args; + int err; + + va_start (args, argc); + err = ahpl_ref_hold_args (ref (), f, argc, args); + va_end (args); + + return err; + } + + int hold_args (ahpl_ref_func_t f, uintptr_t argc, va_list args) + { + return ahpl_ref_hold_args (ref (), f, argc, args); + } + + int hold_argv (ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []) + { + return ahpl_ref_hold_argv (ref (), f, argc, argv); + } + + int read (ahpl_ref_func_t f, uintptr_t argc, ...) + { + va_list args; + int err; + + va_start (args, argc); + err = ahpl_ref_read_args (ref (), f, argc, args); + va_end (args); + + return err; + } + + int read_args (ahpl_ref_func_t f, uintptr_t argc, va_list args) + { + return ahpl_ref_read_args (ref (), f, argc, args); + } + + int read_argv (ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []) + { + return ahpl_ref_read_argv (ref (), f, argc, argv); + } + + int write (ahpl_ref_func_t f, uintptr_t argc, ...) + { + va_list args; + int err; + + va_start (args, argc); + err = ahpl_ref_write_args (ref (), f, argc, args); + va_end (args); + + return err; + } + + int write_args (ahpl_ref_func_t f, uintptr_t argc, va_list args) + { + return ahpl_ref_write_args (ref (), f, argc, args); + } + + int write_argv (ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []) + { + return ahpl_ref_write_argv (ref (), f, argc, argv); + } + + int maystall (ahpl_ref_func_t f, uintptr_t argc, ...) + { + va_list args; + int err; + + va_start (args, argc); + err = ahpl_ref_maystall_args (ref (), f, argc, args); + va_end (args); + + return err; + } + + int maystall_args (ahpl_ref_func_t f, uintptr_t argc, va_list args) + { + return ahpl_ref_maystall_args (ref (), f, argc, args); + } + + int maystall_argv (ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []) + { + return ahpl_ref_maystall_argv (ref (), f, argc, argv); + } + + /* The static version of member functions */ + static int hold (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, ...) 
+ { + va_list args; + int err; + + va_start (args, argc); + err = ahpl_ref_hold_args (ref, f, argc, args); + va_end (args); + + return err; + } + + static int hold_args (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, va_list args) + { + return ahpl_ref_hold_args (ref, f, argc, args); + } + + static int hold_argv (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []) + { + return ahpl_ref_hold_argv (ref, f, argc, argv); + } + + static int read (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, ...) + { + va_list args; + int err; + + va_start (args, argc); + err = ahpl_ref_read_args (ref, f, argc, args); + va_end (args); + + return err; + } + + static int read_args (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, va_list args) + { + return ahpl_ref_read_args (ref, f, argc, args); + } + + static int read_argv (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []) + { + return ahpl_ref_read_argv (ref, f, argc, argv); + } + + static int write (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, ...) + { + va_list args; + int err; + + va_start (args, argc); + err = ahpl_ref_write_args (ref, f, argc, args); + va_end (args); + + return err; + } + + static int write_args (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, va_list args) + { + return ahpl_ref_write_args (ref, f, argc, args); + } + + static int write_argv (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []) + { + return ahpl_ref_write_argv (ref, f, argc, argv); + } + + static int maystall (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, ...) 
+ { + va_list args; + int err; + + va_start (args, argc); + err = ahpl_ref_maystall_args (ref, f, argc, args); + va_end (args); + + return err; + } + + static int maystall_args (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, va_list args) + { + return ahpl_ref_maystall_args (ref, f, argc, args); + } + + static int maystall_argv (ahpl_ref_t ref, ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []) + { + return ahpl_ref_maystall_argv (ref, f, argc, argv); + } + + static int read (ahpl_refobj_t robj, ahpl_ref_func_t f, uintptr_t argc, ...) + { + va_list args; + int err; + + va_start (args, argc); + err = ahpl_refobj_read_args (robj, f, argc, args); + va_end (args); + + return err; + } + + static int read_args (ahpl_refobj_t robj, ahpl_ref_func_t f, uintptr_t argc, va_list args) + { + return ahpl_refobj_read_args (robj, f, argc, args); + } + + static int read_argv (ahpl_refobj_t robj, ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []) + { + return ahpl_refobj_read_argv (robj, f, argc, argv); + } + + static int maystall (ahpl_refobj_t robj, ahpl_ref_func_t f, uintptr_t argc, ...) 
+ { + va_list args; + int err; + + va_start (args, argc); + err = ahpl_refobj_maystall_args (robj, f, argc, args); + va_end (args); + + return err; + } + + static int maystall_args (ahpl_refobj_t robj, ahpl_ref_func_t f, uintptr_t argc, va_list args) + { + return ahpl_refobj_maystall_args (robj, f, argc, args); + } + + static int maystall_argv (ahpl_refobj_t robj, ahpl_ref_func_t f, uintptr_t argc, uintptr_t argv []) + { + return ahpl_refobj_maystall_argv (robj, f, argc, argv); + } + + static ahpl_ref_class *from_refobj (ahpl_refobj_t robj) + { + return (ahpl_ref_class *)ahpl_refobj_arg (robj); + } + + /* set the living scope ref object of this ref object */ + int set_scope (ahpl_ref_t scope_ref) + { + return ahpl_ref_set_scope (ref (), scope_ref); + } + + int destroy (bool do_delete = true) + { + if (!ahpl_ref_invalid (ref_id)) + return ahpl_ref_destroy (ref_id, (int)do_delete); + + if (do_delete) + delete this; + + return 0; + } + +public: + class deleter { + public: + void operator () (ahpl_ref_class *obj_ptr) const + { + if (obj_ptr != NULL) + obj_ptr->destroy (); + } + }; + +protected: + /* We do not allow delete this object directly. */ + virtual ~ahpl_ref_class () + { + } + +private: + static void __dtor (void *arg) + { + ahpl_ref_class *__this = (ahpl_ref_class *)arg; + ::delete __this; + } + +#ifdef __AHPL_MPQ_H__ + /* MPQ relative encapsulations */ +public: + int queue (ahpl_mpq_t tq, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, ...) 
+ { + va_list args; + int err; + + va_start (args, argc); + err = ahpl_mpq_queue_args (tq, dq, ref (), f_name, f, argc, args); + va_end (args); + + return err; + } + + int queue_args (ahpl_mpq_t tq, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, va_list args) + { + return ahpl_mpq_queue_args (tq, dq, ref (), f_name, f, argc, args); + } + + int queue_argv (ahpl_mpq_t tq, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, uintptr_t *argv) + { + return ahpl_mpq_queue_argv (tq, dq, ref (), f_name, f, argc, argv); + } + + int queue_data (ahpl_mpq_t tq, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_data_t f, size_t len, void *data) + { + return ahpl_mpq_queue_data (tq, dq, ref (), f_name, f, len, data); + } + + int call (ahpl_mpq_t q, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, ...) + { + va_list args; + int err; + + va_start (args, argc); + err = ahpl_mpq_call_args (q, ref (), f_name, f, argc, args); + va_end (args); + + return err; + } + + int call_args (ahpl_mpq_t q, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, va_list args) + { + return ahpl_mpq_call_args (q, ref (), f_name, f, argc, args); + } + + int call_argv (ahpl_mpq_t q, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, uintptr_t *argv) + { + return ahpl_mpq_call_argv (q, ref (), f_name, f, argc, argv); + } + + int call_data (ahpl_mpq_t q, const char *f_name, ahpl_mpq_func_data_t f, size_t len, void *data) + { + return ahpl_mpq_call_data (q, ref (), f_name, f, len, data); + } + + int run (ahpl_mpq_t q, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, ...) 
+ { + va_list args; + int err; + + va_start (args, argc); + err = ahpl_mpq_run_args (q, dq, ref (), f_name, f, argc, args); + va_end (args); + + return err; + } + + int run_args (ahpl_mpq_t q, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, va_list args) + { + return ahpl_mpq_run_args (q, dq, ref (), f_name, f, argc, args); + } + + int run_argv (ahpl_mpq_t q, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, uintptr_t *argv) + { + return ahpl_mpq_run_argv (q, dq, ref (), f_name, f, argc, argv); + } + + int run_data (ahpl_mpq_t q, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_data_t f, size_t len, void *data) + { + return ahpl_mpq_run_data (q, dq, ref (), f_name, f, len, data); + } + +#ifdef __AHPL_MPQP_H__ + /* MPQP relative encapsulations */ + ahpl_mpq_t queue (ahpl_mpqp_t qp, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, ...) + { + va_list args; + ahpl_mpq_t qid; + + va_start (args, argc); + qid = ahpl_mpqp_queue_args (qp, dq, ref (), f_name, f, argc, args); + va_end (args); + + return qid; + } + + ahpl_mpq_t queue_args (ahpl_mpqp_t qp, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, va_list args) + { + return ahpl_mpqp_queue_args (qp, dq, ref (), f_name, f, argc, args); + } + + ahpl_mpq_t queue_argv (ahpl_mpqp_t qp, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, uintptr_t *argv) + { + return ahpl_mpqp_queue_argv (qp, dq, ref (), f_name, f, argc, argv); + } + + ahpl_mpq_t queue_data (ahpl_mpqp_t qp, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_data_t f, size_t len, void *data) + { + return ahpl_mpqp_queue_data (qp, dq, ref (), f_name, f, len, data); + } + + ahpl_mpq_t call (ahpl_mpqp_t qp, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, ...) 
+ { + va_list args; + ahpl_mpq_t qid; + + va_start (args, argc); + qid = ahpl_mpqp_call_args (qp, ref (), f_name, f, argc, args); + va_end (args); + + return qid; + } + + ahpl_mpq_t call_args (ahpl_mpqp_t qp, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, va_list args) + { + return ahpl_mpqp_call_args (qp, ref (), f_name, f, argc, args); + } + + ahpl_mpq_t call_argv (ahpl_mpqp_t qp, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, uintptr_t *argv) + { + return ahpl_mpqp_call_argv (qp, ref (), f_name, f, argc, argv); + } + + ahpl_mpq_t call_data (ahpl_mpqp_t qp, const char *f_name, ahpl_mpq_func_data_t f, size_t len, void *data) + { + return ahpl_mpqp_call_data (qp, ref (), f_name, f, len, data); + } + + ahpl_mpq_t run (ahpl_mpqp_t qp, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, ...) + { + va_list args; + ahpl_mpq_t qid; + + va_start (args, argc); + qid = ahpl_mpqp_run_args (qp, dq, ref (), f_name, f, argc, args); + va_end (args); + + return qid; + } + + ahpl_mpq_t run_args (ahpl_mpqp_t qp, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, va_list args) + { + return ahpl_mpqp_run_args (qp, dq, ref (), f_name, f, argc, args); + } + + ahpl_mpq_t run_argv (ahpl_mpqp_t qp, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, uintptr_t *argv) + { + return ahpl_mpqp_run_argv (qp, dq, ref (), f_name, f, argc, argv); + } + + ahpl_mpq_t run_data (ahpl_mpqp_t qp, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_data_t f, size_t len, void *data) + { + return ahpl_mpqp_run_data (qp, dq, ref (), f_name, f, len, data); + } + + int pool_tail_queue (ahpl_mpqp_t qp, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, ...) 
+	{
+		va_list args;
+		int err;
+
+		va_start (args, argc);
+		err = ahpl_mpqp_pool_tail_queue_args (qp, dq, ref (), f_name, f, argc, args);
+		va_end (args);
+
+		return err;
+	}
+
+	int pool_tail_queue_args (ahpl_mpqp_t qp, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, va_list args)
+	{
+		return ahpl_mpqp_pool_tail_queue_args (qp, dq, ref (), f_name, f, argc, args);
+	}
+
+	int pool_tail_queue_argv (ahpl_mpqp_t qp, ahpl_mpq_t dq, const char *f_name, ahpl_mpq_func_argv_t f, uintptr_t argc, uintptr_t *argv)
+	{
+		return ahpl_mpqp_pool_tail_queue_argv (qp, dq, ref (), f_name, f, argc, argv);
+	}
+#endif /* __AHPL_MPQP_H__ */
+#endif /* __AHPL_MPQ_H__ */
+
+	/* C++11 lambda encapsulations */
+#if (__cplusplus >= 201103) || defined (_MSC_VER)
+public:
+	typedef std::function<void (ahpl_ref_class *obj)> ahpl_ref_lambda_f;
+
+	int hold (ahpl_ref_lambda_f &&lambda_f)
+	{
+		ahpl_ref_lambda_f lambda_obj (std::move (lambda_f));
+		return ahpl_ref_class::hold (____ref_f, 1, &lambda_obj);
+	}
+
+	int read (ahpl_ref_lambda_f &&lambda_f)
+	{
+		ahpl_ref_lambda_f lambda_obj (std::move (lambda_f));
+		return ahpl_ref_class::read (____ref_f, 1, &lambda_obj);
+	}
+
+	int write (ahpl_ref_lambda_f &&lambda_f)
+	{
+		ahpl_ref_lambda_f lambda_obj (std::move (lambda_f));
+		return ahpl_ref_class::write (____ref_f, 1, &lambda_obj);
+	}
+
+	int maystall (ahpl_ref_lambda_f &&lambda_f)
+	{
+		ahpl_ref_lambda_f lambda_obj (std::move (lambda_f));
+		return ahpl_ref_class::maystall (____ref_f, 1, &lambda_obj);
+	}
+
+	static int hold (ahpl_ref_t ref, ahpl_ref_lambda_f &&lambda_f)
+	{
+		ahpl_ref_lambda_f lambda_obj (std::move (lambda_f));
+		return ahpl_ref_class::hold (ref, ____ref_f, 1, &lambda_obj);
+	}
+
+	static int read (ahpl_ref_t ref, ahpl_ref_lambda_f &&lambda_f)
+	{
+		ahpl_ref_lambda_f lambda_obj (std::move (lambda_f));
+		return ahpl_ref_class::read (ref, ____ref_f, 1, &lambda_obj);
+	}
+
+	static int write (ahpl_ref_t ref, ahpl_ref_lambda_f &&lambda_f)
+	{
+		ahpl_ref_lambda_f lambda_obj (std::move (lambda_f));
+		return ahpl_ref_class::write (ref, ____ref_f, 1, &lambda_obj);
+	}
+
+	static int maystall (ahpl_ref_t ref, ahpl_ref_lambda_f &&lambda_f)
+	{
+		ahpl_ref_lambda_f lambda_obj (std::move (lambda_f));
+		return ahpl_ref_class::maystall (ref, ____ref_f, 1, &lambda_obj);
+	}
+
+	static int read (ahpl_refobj_t robj, ahpl_ref_lambda_f &&lambda_f)
+	{
+		ahpl_ref_lambda_f lambda_obj (std::move (lambda_f));
+		return ahpl_ref_class::read (robj, ____ref_f, 1, &lambda_obj);
+	}
+
+	static int maystall (ahpl_refobj_t robj, ahpl_ref_lambda_f &&lambda_f)
+	{
+		ahpl_ref_lambda_f lambda_obj (std::move (lambda_f));
+		return ahpl_ref_class::maystall (robj, ____ref_f, 1, &lambda_obj);
+	}
+
+private:
+	static void ____ref_f (void *arg, uintptr_t argc, uintptr_t argv [])
+	{
+		ahpl_ref_lambda_f *lambda_obj = reinterpret_cast<ahpl_ref_lambda_f *>(argv [0]);
+		(*lambda_obj) ((ahpl_ref_class *)arg);
+	}
+
+#ifdef __AHPL_MPQ_H__
+public:
+	typedef std::function<void (const ahpl_ts_t &queued_ts, ahpl_refobj_t robj)> ahpl_ref_mpq_lambda_f;
+
+	/* MPQ encapsulations */
+	int queue (ahpl_mpq_t tq, ahpl_mpq_t dq, const char *f_name, ahpl_ref_mpq_lambda_f&& task)
+	{
+		ahpl_ref_mpq_lambda_f *task_obj = new ahpl_ref_mpq_lambda_f (std::move (task));
+		int err = ahpl_ref_class::queue (tq, dq, f_name, ____mpq_f, 1, task_obj);
+		if (err < 0)
+			delete task_obj;
+
+		return err;
+	}
+
+	int call (ahpl_mpq_t q, const char *f_name, ahpl_ref_mpq_lambda_f&& task, void *task_result = NULL)
+	{
+		ahpl_ref_mpq_lambda_f *task_obj = new ahpl_ref_mpq_lambda_f (std::move (task));
+		int err = ahpl_ref_class::call (q, f_name, ____mpq_f, 2, task_obj, task_result);
+		if (err < 0)
+			delete task_obj;
+
+		return err;
+	}
+
+	int run (ahpl_mpq_t q, const char *f_name, ahpl_ref_mpq_lambda_f&& task)
+	{
+		ahpl_ref_mpq_lambda_f *task_obj = new ahpl_ref_mpq_lambda_f (std::move (task));
+		int err = ahpl_ref_class::run (q, AHPL_MPQ_INVALID, f_name, ____mpq_f, 1, task_obj);
+		if (err < 0)
+			delete task_obj;
+
+		return err;
+	}
+
+#ifdef __AHPL_MPQP_H__
+	/* MPQP encapsulations */
+ ahpl_mpq_t queue (ahpl_mpqp_t qp, ahpl_mpq_t dq, const char *f_name, ahpl_ref_mpq_lambda_f&& task) + { + ahpl_ref_mpq_lambda_f *task_obj = new ahpl_ref_mpq_lambda_f (std::move (task)); + ahpl_mpq_t qid = ahpl_ref_class::queue (qp, dq, f_name, ____mpq_f, 1, task_obj); + if (ahpl_mpq_invalid (qid)) + delete task_obj; + + return qid; + } + + ahpl_mpq_t call (ahpl_mpqp_t qp, const char *f_name, ahpl_ref_mpq_lambda_f&& task, void *task_result = NULL) + { + ahpl_ref_mpq_lambda_f *task_obj = new ahpl_ref_mpq_lambda_f (std::move (task)); + ahpl_mpq_t qid = ahpl_ref_class::call (qp, f_name, ____mpq_f, 2, task_obj, task_result); + if (ahpl_mpq_invalid (qid)) + delete task_obj; + + return qid; + } + + ahpl_mpq_t run (ahpl_mpqp_t qp, const char *f_name, ahpl_ref_mpq_lambda_f&& task) + { + ahpl_ref_mpq_lambda_f *task_obj = new ahpl_ref_mpq_lambda_f (std::move (task)); + ahpl_mpq_t qid = ahpl_ref_class::run (qp, AHPL_MPQ_INVALID, f_name, ____mpq_f, 1, task_obj); + if (ahpl_mpq_invalid (qid)) + delete task_obj; + + return qid; + } + + int pool_tail_queue (ahpl_mpqp_t qp, ahpl_mpq_t dq, const char *f_name, ahpl_ref_mpq_lambda_f&& task) + { + ahpl_ref_mpq_lambda_f *task_obj = new ahpl_ref_mpq_lambda_f (std::move (task)); + int err = ahpl_ref_class::pool_tail_queue (qp, dq, f_name, ____mpq_f, 1, task_obj); + if (err < 0) + delete task_obj; + + return err; + } +#endif /* __AHPL_MPQP_H__ */ + + /* MPQ with specified ref encapsulations */ + static int queue (ahpl_mpq_t tq, ahpl_mpq_t dq, ahpl_ref_t ref, const char *f_name, ahpl_ref_mpq_lambda_f&& task) + { + ahpl_ref_mpq_lambda_f *task_obj = new ahpl_ref_mpq_lambda_f (std::move (task)); + int err = ahpl_mpq_queue (tq, dq, ref, f_name, ____mpq_f, 1, task_obj); + if (err < 0) + delete task_obj; + + return err; + } + + static int call (ahpl_mpq_t q, ahpl_ref_t ref, const char *f_name, ahpl_ref_mpq_lambda_f&& task, void *task_result = NULL) + { + ahpl_ref_mpq_lambda_f *task_obj = new ahpl_ref_mpq_lambda_f (std::move (task)); + int err = 
ahpl_mpq_call (q, ref, f_name, ____mpq_f, 2, task_obj, task_result); + if (err < 0) + delete task_obj; + + return err; + } + + static int run (ahpl_mpq_t q, ahpl_ref_t ref, const char *f_name, ahpl_ref_mpq_lambda_f&& task) + { + ahpl_ref_mpq_lambda_f *task_obj = new ahpl_ref_mpq_lambda_f (std::move (task)); + int err = ahpl_mpq_run (q, AHPL_MPQ_INVALID, ref, f_name, ____mpq_f, 1, task_obj); + if (err < 0) + delete task_obj; + + return err; + } + +#ifdef __AHPL_MPQP_H__ + /* MPQP with specified ref encapsulations */ + static ahpl_mpq_t queue (ahpl_mpqp_t qp, ahpl_mpq_t dq, ahpl_ref_t ref, const char *f_name, ahpl_ref_mpq_lambda_f&& task) + { + ahpl_ref_mpq_lambda_f *task_obj = new ahpl_ref_mpq_lambda_f (std::move (task)); + ahpl_mpq_t qid = ahpl_mpqp_queue (qp, dq, ref, f_name, ____mpq_f, 1, task_obj); + if (ahpl_mpq_invalid (qid)) + delete task_obj; + + return qid; + } + + static ahpl_mpq_t call (ahpl_mpqp_t qp, ahpl_ref_t ref, const char *f_name, ahpl_ref_mpq_lambda_f&& task, void *task_result = NULL) + { + ahpl_ref_mpq_lambda_f *task_obj = new ahpl_ref_mpq_lambda_f (std::move (task)); + ahpl_mpq_t qid = ahpl_mpqp_call (qp, ref, f_name, ____mpq_f, 2, task_obj, task_result); + if (ahpl_mpq_invalid (qid)) + delete task_obj; + + return qid; + } + + static ahpl_mpq_t run (ahpl_mpqp_t qp, ahpl_ref_t ref, const char *f_name, ahpl_ref_mpq_lambda_f&& task) + { + ahpl_ref_mpq_lambda_f *task_obj = new ahpl_ref_mpq_lambda_f (std::move (task)); + ahpl_mpq_t qid = ahpl_mpqp_run (qp, AHPL_MPQ_INVALID, ref, f_name, ____mpq_f, 1, task_obj); + if (ahpl_mpq_invalid (qid)) + delete task_obj; + + return qid; + } + + static int pool_tail_queue (ahpl_mpqp_t qp, ahpl_mpq_t dq, ahpl_ref_t ref, const char *f_name, ahpl_ref_mpq_lambda_f&& task) + { + ahpl_ref_mpq_lambda_f *task_obj = new ahpl_ref_mpq_lambda_f (std::move (task)); + int err = ahpl_mpqp_pool_tail_queue (qp, dq, ref, f_name, ____mpq_f, 1, task_obj); + if (err < 0) + delete task_obj; + + return err; + } +#endif /* 
__AHPL_MPQP_H__ */
+
+	static void *call_result_var_addr (void)
+	{
+		void *var_addr;
+
+		if (ahpl_mpq_run_func_arg (1, (uintptr_t *)&var_addr) < 0)
+			return NULL;
+
+		return var_addr;
+	}
+
+private:
+	static void ____mpq_f (const ahpl_ts_t *queued_ts_p, ahpl_refobj_t robj, uintptr_t argc, uintptr_t argv [])
+	{
+		ahpl_ref_mpq_lambda_f *task_obj = reinterpret_cast<ahpl_ref_mpq_lambda_f *>(argv [0]);
+		ahpl_mpq_t done_qid = ahpl_mpq_run_func_done_qid ();
+		(*task_obj) (*queued_ts_p, robj);
+		if (ahpl_mpq_invalid (done_qid) || ahpl_is_free_only (robj)) {
+			/**
+			 * We only free the task object here when the running function
+			 * has no done mpq id, because the task object would still be
+			 * in use when queuing back to the done mpq if the function
+			 * does have one.
+			 * -- Lionfore Hao Nov 19th, 2018
+			 **/
+			delete task_obj;
+		}
+	}
+#endif /* __AHPL_MPQ_H__ */
+
+#ifdef __AHPL_ASYNC_H__
+	/**
+	 * The stackless-coroutine-like implementation in AHPL. We cannot
+	 * support real stackless coroutines except at the language level,
+	 * so we provide similar equivalent functionality here.
+	 **/
+public:
+	typedef std::function<int (int free_only)> ahpl_async_prepare_lambda_f;
+
+	int prepare (ahpl_stack_id_t stack_id, const char *f_name, ahpl_async_prepare_lambda_f&& task)
+	{
+		ahpl_async_prepare_lambda_f *prepare_f = new ahpl_async_prepare_lambda_f (std::move (task));
+		int err = ahpl_async_prepare (stack_id, ref (), f_name, ____async_prepare_f, 1, prepare_f);
+		if (err < 0)
+			delete prepare_f;
+
+		return err;
+	}
+
+	static int prepare (ahpl_stack_id_t stack_id, ahpl_ref_t ref, const char *f_name, ahpl_async_prepare_lambda_f&& task)
+	{
+		ahpl_async_prepare_lambda_f *prepare_f = new ahpl_async_prepare_lambda_f (std::move (task));
+		int err = ahpl_async_prepare (stack_id, ref, f_name, ____async_prepare_f, 1, prepare_f);
+		if (err < 0)
+			delete prepare_f;
+
+		return err;
+	}
+
+private:
+	static int ____async_prepare_f (int free_only, uintptr_t argc, uintptr_t argv [])
+	{
+		ahpl_async_prepare_lambda_f *prepare_f = reinterpret_cast<ahpl_async_prepare_lambda_f *>(argv [0]);
+		int err;
+		err = (*prepare_f) (free_only);
+		delete prepare_f;
+		return err;
+	}
+
+public:
+	typedef std::function<void (int free_only)> ahpl_async_resume_lambda_f;
+
+	int resume (ahpl_stack_id_t stack_id, const char *f_name, ahpl_async_resume_lambda_f&& task)
+	{
+		ahpl_async_resume_lambda_f *resume_f = new ahpl_async_resume_lambda_f (std::move (task));
+		int err = ahpl_async_resume (stack_id, ref (), f_name, ____async_resume_f, 1, resume_f);
+		if (err < 0)
+			delete resume_f;
+
+		return err;
+	}
+
+	static int resume (ahpl_stack_id_t stack_id, ahpl_ref_t ref, const char *f_name, ahpl_async_resume_lambda_f&& task)
+	{
+		ahpl_async_resume_lambda_f *resume_f = new ahpl_async_resume_lambda_f (std::move (task));
+		int err = ahpl_async_resume (stack_id, ref, f_name, ____async_resume_f, 1, resume_f);
+		if (err < 0)
+			delete resume_f;
+
+		return err;
+	}
+
+private:
+	static void ____async_resume_f (int free_only, uintptr_t argc, uintptr_t argv [])
+	{
+		ahpl_async_resume_lambda_f *resume_f = reinterpret_cast<ahpl_async_resume_lambda_f *>(argv [0]);
+		(*resume_f) (free_only);
+		delete resume_f;
+	}
+#endif /* __AHPL_ASYNC_H__ */
+#endif /* C++11 */
+
+#if (__cplusplus >= 201103) || defined (_MSC_VER)
+private:
+	ahpl_ref_class (const ahpl_ref_class &) = delete;
+	ahpl_ref_class (ahpl_ref_class &&) = delete;
+	ahpl_ref_class &operator = (const ahpl_ref_class &) = delete;
+	ahpl_ref_class &operator = (ahpl_ref_class &&) = delete;
+#else
+private:
+	ahpl_ref_class (const ahpl_ref_class &);
+	ahpl_ref_class &operator = (const ahpl_ref_class &);
+#endif /* C++11 */
+};
+
+
+/**
+ * The T_ref_cls argument of this template must be
+ * ahpl_ref_class or one of its derivatives.
+ **/
+template <class T_ref_cls>
+class ahpl_ref_unique_ptr {
+private:
+	T_ref_cls *_ptr;
+
+public:
+	ahpl_ref_unique_ptr (): _ptr (NULL) {}
+	ahpl_ref_unique_ptr (T_ref_cls *p): _ptr (p) {}
+
+	ahpl_ref_unique_ptr &operator = (T_ref_cls *p)
+	{
+		reset ();
+		_ptr = p;
+		return *this;
+	}
+
+	T_ref_cls *operator -> () const
+	{
+		return _ptr;
+	}
+
+	T_ref_cls *get () const
+	{
+		return _ptr;
+	}
+
+	operator bool () const
+	{
+		return _ptr != NULL;
+	}
+
+	T_ref_cls *release ()
+	{
+		T_ref_cls *p = _ptr;
+		_ptr = NULL;
+		return p;
+	}
+
+	void reset (T_ref_cls *p = NULL)
+	{
+		T_ref_cls *old = _ptr;
+
+		/**
+		 * We destroy but do not delete the object before
+		 * setting the pointer to the new value; this is
+		 * very important to make sure that no async
+		 * operation is still executing.
+		 **/
+		if (old != NULL)
+			old->destroy (false/* not delete */);
+
+		_ptr = p;
+
+		/**
+		 * The destroy with delete operation must be
+		 * the last action, and don't touch any member
+		 * of this object anymore after it.
+		 **/
+		if (old != NULL)
+			old->destroy (true/* do delete */);
+	}
+
+	~ahpl_ref_unique_ptr ()
+	{
+		reset ();
+	}
+
+#if (__cplusplus >= 201103) || defined (_MSC_VER)
+private:
+	ahpl_ref_unique_ptr (const ahpl_ref_unique_ptr &) = delete;
+	ahpl_ref_unique_ptr &operator = (const ahpl_ref_unique_ptr &) = delete;
+
+public:
+	ahpl_ref_unique_ptr (ahpl_ref_unique_ptr &&src): _ptr (src.release ()) {}
+	ahpl_ref_unique_ptr &operator = (ahpl_ref_unique_ptr &&ptr)
+	{
+		reset (ptr.release ());
+		return *this;
+	}
+#else
+private:
+	ahpl_ref_unique_ptr (const ahpl_ref_unique_ptr &);
+	ahpl_ref_unique_ptr &operator = (const ahpl_ref_unique_ptr &);
+#endif /* C++11 */
+};
+
+
+template <class T_ref_cls>
+inline bool operator == (const ahpl_ref_unique_ptr<T_ref_cls> &ptr, intptr_t _null)
+{
+	return ptr.get () == (T_ref_cls *)_null;
+}
+
+template <class T_ref_cls>
+inline bool operator != (const ahpl_ref_unique_ptr<T_ref_cls> &ptr, intptr_t _null)
+{
+	return ptr.get () != (T_ref_cls *)_null;
+}
+
+template <class T_ref_cls>
+inline bool operator == (intptr_t _null, const ahpl_ref_unique_ptr<T_ref_cls> &ptr)
+{
+	return (T_ref_cls *)_null == ptr.get ();
+}
+
+template <class T_ref_cls>
+inline bool operator != (intptr_t _null, const ahpl_ref_unique_ptr<T_ref_cls> &ptr)
+{
+	return (T_ref_cls *)_null != ptr.get ();
+}
+
+#if (__cplusplus >= 201103) || defined (_MSC_VER)
+template <class T_ref_cls>
+inline bool operator == (const ahpl_ref_unique_ptr<T_ref_cls> &ptr, nullptr_t)
+{
+	return !ptr;
+}
+
+template <class T_ref_cls>
+inline bool operator != (const ahpl_ref_unique_ptr<T_ref_cls> &ptr, nullptr_t)
+{
+	return ptr;
+}
+
+template <class T_ref_cls>
+inline bool operator == (nullptr_t, const ahpl_ref_unique_ptr<T_ref_cls> &ptr)
+{
+	return !ptr;
+}
+
+template <class T_ref_cls>
+inline bool operator != (nullptr_t, const ahpl_ref_unique_ptr<T_ref_cls> &ptr)
+{
+	return ptr;
+}
+#endif /* C++11 */
+
+
+typedef ahpl_ref_unique_ptr<ahpl_ref_class> ahpl_ref_class_unique_ptr;
+
+
+#endif /* __AHPL_REF_OBJ_CPP_H__ */
\ No newline at end of file
diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/CMakeLists.txt b/Android/APIExample/agora-simple-filter/src/main/cpp/CMakeLists.txt
index 2f2c24c33..b052242a5
100644 --- a/Android/APIExample/agora-simple-filter/src/main/cpp/CMakeLists.txt +++ b/Android/APIExample/agora-simple-filter/src/main/cpp/CMakeLists.txt @@ -11,15 +11,6 @@ project(agora-simple-filter) set(agora-lib-so ${PROJECT_SOURCE_DIR}/../agoraLibs/${CMAKE_ANDROID_ARCH_ABI}/libagora-rtc-sdk.so) link_libraries(${agora-lib-so}) -set(agora-ffmpeg-so ${PROJECT_SOURCE_DIR}/../agoraLibs/${CMAKE_ANDROID_ARCH_ABI}/libagora-ffmpeg.so) -link_libraries(${agora-ffmpeg-so}) - -set(agora-soundtouch-so ${PROJECT_SOURCE_DIR}/../agoraLibs/${CMAKE_ANDROID_ARCH_ABI}/libagora-soundtouch.so) -link_libraries(${agora-soundtouch-so}) - -set(agora-fdkaac-so ${PROJECT_SOURCE_DIR}/../agoraLibs/${CMAKE_ANDROID_ARCH_ABI}/libagora-fdkaac.so) -link_libraries(${agora-fdkaac-so}) - #link opencv so set(opencv-lib-so ${PROJECT_SOURCE_DIR}/../jniLibs/${CMAKE_ANDROID_ARCH_ABI}/libopencv_java4.so) link_libraries(${opencv-lib-so}) @@ -70,6 +61,7 @@ find_library( # Sets the name of the path variable. # build script, prebuilt third-party libraries, or system libraries. target_include_directories(agora-simple-filter PRIVATE ${PROJECT_SOURCE_DIR}) +target_include_directories(agora-simple-filter PRIVATE ${PROJECT_SOURCE_DIR}/AgoraRtcKit) target_link_libraries( # Specifies the target library. 
	agora-simple-filter
diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/plugin_source_code/VideoProcessor.cpp b/Android/APIExample/agora-simple-filter/src/main/cpp/plugin_source_code/VideoProcessor.cpp
index 387c1324d..17fd83d42 100644
--- a/Android/APIExample/agora-simple-filter/src/main/cpp/plugin_source_code/VideoProcessor.cpp
+++ b/Android/APIExample/agora-simple-filter/src/main/cpp/plugin_source_code/VideoProcessor.cpp
@@ -33,10 +33,6 @@ namespace agora {
         if (!eglCore_) {
             eglCore_ = new EglCore();
             offscreenSurface_ = eglCore_->createOffscreenSurface(640, 320);
-
-        }
-        if (!eglCore_->isCurrent(offscreenSurface_)) {
-            eglCore_->makeCurrent(offscreenSurface_);
         }
 #endif
         return true;
@@ -57,10 +53,36 @@ namespace agora {
         return true;
     }
 
+    bool WatermarkProcessor::makeCurrent() {
+        const std::lock_guard<std::mutex> lock(mutex_);
+#if defined(__ANDROID__) || defined(TARGET_OS_ANDROID)
+        if (eglCore_ && offscreenSurface_) {
+            if (!eglCore_->isCurrent(offscreenSurface_)) {
+                eglCore_->makeCurrent(offscreenSurface_);
+            }
+            return true;
+        }
+#endif
+        return false;
+    }
+
+    bool WatermarkProcessor::detachCurrent() {
+        const std::lock_guard<std::mutex> lock(mutex_);
+#if defined(__ANDROID__) || defined(TARGET_OS_ANDROID)
+        if (eglCore_) {
+            eglCore_->makeNothingCurrent();
+            return true;
+        }
+#endif
+        return false;
+    }
+
     int WatermarkProcessor::processFrame(agora::rtc::VideoFrameData &capturedFrame) {
 //        PRINTF_INFO("processFrame: w: %d, h: %d, r: %d, enable: %d", capturedFrame.width, capturedFrame.height, capturedFrame.rotation, wmEffectEnabled_);
         if (wmEffectEnabled_) {
+            makeCurrent();
             addWatermark(capturedFrame);
+            detachCurrent();
         }
         return 0;
     }
diff --git a/Android/APIExample/agora-simple-filter/src/main/cpp/plugin_source_code/VideoProcessor.h b/Android/APIExample/agora-simple-filter/src/main/cpp/plugin_source_code/VideoProcessor.h
index 01d85957e..75fc57ce2 100644
--- a/Android/APIExample/agora-simple-filter/src/main/cpp/plugin_source_code/VideoProcessor.h
+++
b/Android/APIExample/agora-simple-filter/src/main/cpp/plugin_source_code/VideoProcessor.h @@ -26,6 +26,10 @@ namespace agora { bool releaseOpenGL(); + bool makeCurrent(); + + bool detachCurrent(); + int processFrame(agora::rtc::VideoFrameData &capturedFrame); int setParameters(std::string parameter); diff --git a/Android/APIExample/agora-stream-encrypt/.gitignore b/Android/APIExample/agora-stream-encrypt/.gitignore new file mode 100644 index 000000000..42afabfd2 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/.gitignore @@ -0,0 +1 @@ +/build \ No newline at end of file diff --git a/Android/APIExample/agora-stream-encrypt/build.gradle b/Android/APIExample/agora-stream-encrypt/build.gradle new file mode 100644 index 000000000..c54d70f49 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/build.gradle @@ -0,0 +1,48 @@ +apply plugin: 'com.android.library' + +android { + compileSdkVersion 32 + buildToolsVersion "32.0.0" + + defaultConfig { + minSdkVersion 21 + targetSdkVersion 32 + versionCode 1 + versionName "1.0" + + testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner" + consumerProguardFiles "consumer-rules.pro" + + externalNativeBuild { + cmake { + cppFlags "-std=c++14" + abiFilters "armeabi-v7a", "arm64-v8a" + arguments "-DANDROID_STL=c++_shared" + } + } + } + + buildTypes { + release { + minifyEnabled false + proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro' + } + } + + externalNativeBuild { + cmake { + path "src/main/cpp/CMakeLists.txt" + version "3.10.2" + + } + } + +} + +dependencies { + api fileTree(dir: "libs", include: ["*.jar", "*.aar"]) + implementation 'androidx.appcompat:appcompat:1.1.0' + testImplementation 'junit:junit:4.12' + androidTestImplementation 'androidx.test.ext:junit:1.1.1' + androidTestImplementation 'androidx.test.espresso:espresso-core:3.2.0' +} diff --git a/Android/APIExample/agora-stream-encrypt/consumer-rules.pro 
b/Android/APIExample/agora-stream-encrypt/consumer-rules.pro new file mode 100644 index 000000000..e69de29bb diff --git a/Android/APIExample/agora-stream-encrypt/proguard-rules.pro b/Android/APIExample/agora-stream-encrypt/proguard-rules.pro new file mode 100644 index 000000000..481bb4348 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/proguard-rules.pro @@ -0,0 +1,21 @@ +# Add project specific ProGuard rules here. +# You can control the set of applied configuration files using the +# proguardFiles setting in build.gradle. +# +# For more details, see +# http://developer.android.com/guide/developing/tools/proguard.html + +# If your project uses WebView with JS, uncomment the following +# and specify the fully qualified class name to the JavaScript interface +# class: +#-keepclassmembers class fqcn.of.javascript.interface.for.webview { +# public *; +#} + +# Uncomment this to preserve the line number information for +# debugging stack traces. +#-keepattributes SourceFile,LineNumberTable + +# If you keep the line number information, uncomment this to +# hide the original source file name. 
+#-renamesourcefileattribute SourceFile \ No newline at end of file diff --git a/Android/APIExample/agora-stream-encrypt/src/main/AndroidManifest.xml b/Android/APIExample/agora-stream-encrypt/src/main/AndroidManifest.xml new file mode 100644 index 000000000..76f4571a9 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/AndroidManifest.xml @@ -0,0 +1,3 @@ + + + \ No newline at end of file diff --git a/Android/APIExample/agora-stream-encrypt/src/main/agoraLibs b/Android/APIExample/agora-stream-encrypt/src/main/agoraLibs new file mode 120000 index 000000000..a4d04c322 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/agoraLibs @@ -0,0 +1 @@ +../../../../../sdk \ No newline at end of file diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/CMakeLists.txt b/Android/APIExample/agora-stream-encrypt/src/main/cpp/CMakeLists.txt new file mode 100644 index 000000000..4547b6c56 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/CMakeLists.txt @@ -0,0 +1,72 @@ +# For more information about using CMake with Android Studio, read the +# documentation: https://d.android.com/studio/projects/add-native-code.html + +# Sets the minimum version of CMake required to build the native library. 
+ +cmake_minimum_required(VERSION 3.4.1) + +project(agora-stream-encrypt) + +#link agora so +set(agora-lib-so ${PROJECT_SOURCE_DIR}/../agoraLibs/${CMAKE_ANDROID_ARCH_ABI}/libagora-rtc-sdk.so) +link_libraries(${agora-lib-so}) + +set(agora-ffmpeg-so ${PROJECT_SOURCE_DIR}/../agoraLibs/${CMAKE_ANDROID_ARCH_ABI}/libagora-ffmpeg.so) +link_libraries(${agora-ffmpeg-so}) + +set(agora-soundtouch-so ${PROJECT_SOURCE_DIR}/../agoraLibs/${CMAKE_ANDROID_ARCH_ABI}/libagora-soundtouch.so) +link_libraries(${agora-soundtouch-so}) + +set(agora-fdkaac-so ${PROJECT_SOURCE_DIR}/../agoraLibs/${CMAKE_ANDROID_ARCH_ABI}/libagora-fdkaac.so) +link_libraries(${agora-fdkaac-so}) + + +set (CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++14") + + +# Creates and names a library, sets it as either STATIC +# or SHARED, and provides the relative paths to its source code. +# You can define multiple libraries, and CMake builds them for you. +# Gradle automatically packages shared libraries with your APK. + +add_library( # Sets the name of the library. + agora-stream-encrypt + + # Sets the library as a shared library. + SHARED + # Provides a relative path to your source file(s). + packet_processing_plugin_jni.cpp) + +# Searches for a specified prebuilt library and stores the path as a +# variable. Because CMake includes system libraries in the search path by +# default, you only need to specify the name of the public NDK library +# you want to add. CMake verifies that the library exists before +# completing its build. + +find_library( # Sets the name of the path variable. + log-lib + + # Specifies the name of the NDK library that + # you want CMake to locate. + log ) + +# Specifies libraries CMake should link to your target library. You +# can link multiple libraries, such as libraries you define in this +# build script, prebuilt third-party libraries, or system libraries. 
+ +target_include_directories(agora-stream-encrypt PRIVATE ${PROJECT_SOURCE_DIR}) + +target_link_libraries( # Specifies the target library. + agora-stream-encrypt + ${agora-lib-so} + ${agora-ffmpeg-so} + ${agora-soundtouch-so} + ${agora-fdkaac-so} + + # Links the target library to the log library + # included in the NDK. + ${log-lib} +# ${GLESv2} + ${GLESv3} + EGL + android) \ No newline at end of file diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraBase.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraBase.h new file mode 100644 index 000000000..a3b4647a6 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraBase.h @@ -0,0 +1,6211 @@ +// +// Agora Engine SDK +// +// Created by Sting Feng in 2017-11. +// Copyright (c) 2017 Agora.io. All rights reserved. +// + +// This header file is included by both high level and low level APIs, +#pragma once // NOLINT(build/header_guard) + +#include +#include +#include +#include +#include + +#include "IAgoraParameter.h" +#include "AgoraMediaBase.h" +#include "AgoraRefPtr.h" +#include "AgoraOptional.h" + +#define MAX_PATH_260 (260) + +#if defined(_WIN32) + +#ifndef WIN32_LEAN_AND_MEAN +#define WIN32_LEAN_AND_MEAN +#endif // !WIN32_LEAN_AND_MEAN +#if defined(__aarch64__) +#include +#endif +#include + +#if defined(AGORARTC_EXPORT) +#define AGORA_API extern "C" __declspec(dllexport) +#else +#define AGORA_API extern "C" __declspec(dllimport) +#endif // AGORARTC_EXPORT + +#define AGORA_CALL __cdecl + +#define __deprecated + +#elif defined(__APPLE__) + +#include + +#define AGORA_API extern "C" __attribute__((visibility("default"))) +#define AGORA_CALL + +#elif defined(__ANDROID__) || defined(__linux__) + +#define AGORA_API extern "C" __attribute__((visibility("default"))) +#define AGORA_CALL + +#define __deprecated + +#else // !_WIN32 && !__APPLE__ && !(__ANDROID__ || __linux__) + +#define AGORA_API extern "C" +#define AGORA_CALL + 
+#define __deprecated
+
+#endif // _WIN32
+
+#ifndef OPTIONAL_ENUM_SIZE_T
+#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800)
+#define OPTIONAL_ENUM_SIZE_T enum : size_t
+#else
+#define OPTIONAL_ENUM_SIZE_T enum
+#endif
+#endif
+
+#ifndef OPTIONAL_NULLPTR
+#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800)
+#define OPTIONAL_NULLPTR nullptr
+#else
+#define OPTIONAL_NULLPTR NULL
+#endif
+#endif
+
+#define INVALID_DISPLAY_ID (-2)
+
+namespace agora {
+namespace util {
+
+template <class T>
+class AutoPtr {
+ protected:
+  typedef T value_type;
+  typedef T* pointer_type;
+
+ public:
+  explicit AutoPtr(pointer_type p = OPTIONAL_NULLPTR) : ptr_(p) {}
+
+  ~AutoPtr() {
+    if (ptr_) {
+      ptr_->release();
+      ptr_ = OPTIONAL_NULLPTR;
+    }
+  }
+
+  operator bool() const { return (ptr_ != OPTIONAL_NULLPTR); }
+
+  value_type& operator*() const { return *get(); }
+
+  pointer_type operator->() const { return get(); }
+
+  pointer_type get() const { return ptr_; }
+
+  pointer_type release() {
+    pointer_type ret = ptr_;
+    ptr_ = 0;
+    return ret;
+  }
+
+  void reset(pointer_type ptr = OPTIONAL_NULLPTR) {
+    if (ptr != ptr_ && ptr_) {
+      ptr_->release();
+    }
+
+    ptr_ = ptr;
+  }
+
+  template <class C1, class C2>
+  bool queryInterface(C1* c, C2 iid) {
+    pointer_type p = OPTIONAL_NULLPTR;
+    if (c && !c->queryInterface(iid, reinterpret_cast<void**>(&p))) {
+      reset(p);
+    }
+
+    return (p != OPTIONAL_NULLPTR);
+  }
+
+ private:
+  AutoPtr(const AutoPtr&);
+  AutoPtr& operator=(const AutoPtr&);
+
+ private:
+  pointer_type ptr_;
+};
+
+template <class T>
+class CopyableAutoPtr : public AutoPtr<T> {
+  typedef typename AutoPtr<T>::pointer_type pointer_type;
+
+ public:
+  explicit CopyableAutoPtr(pointer_type p = 0) : AutoPtr<T>(p) {}
+  CopyableAutoPtr(const CopyableAutoPtr& rhs) { this->reset(rhs.clone()); }
+  CopyableAutoPtr& operator=(const CopyableAutoPtr& rhs) {
+    if (this != &rhs) this->reset(rhs.clone());
+    return *this;
+  }
+  pointer_type clone() const {
+    if (!this->get()) return OPTIONAL_NULLPTR;
+    return this->get()->clone();
+  }
+};
+
+class IString {
+ public:
+  virtual bool empty() const = 0;
+  virtual const char* c_str() = 0;
+  virtual const char* data() = 0;
+  virtual size_t length() = 0;
+  virtual IString* clone() = 0;
+  virtual void release() = 0;
+  virtual ~IString() {}
+};
+typedef CopyableAutoPtr<IString> AString;
+
+class IIterator {
+ public:
+  virtual void* current() = 0;
+  virtual const void* const_current() const = 0;
+  virtual bool next() = 0;
+  virtual void release() = 0;
+  virtual ~IIterator() {}
+};
+
+class IContainer {
+ public:
+  virtual IIterator* begin() = 0;
+  virtual size_t size() const = 0;
+  virtual void release() = 0;
+  virtual ~IContainer() {}
+};
+
+template <class T>
+class AOutputIterator {
+  IIterator* p;
+
+ public:
+  typedef T value_type;
+  typedef value_type& reference;
+  typedef const value_type& const_reference;
+  typedef value_type* pointer;
+  typedef const value_type* const_pointer;
+  explicit AOutputIterator(IIterator* it = OPTIONAL_NULLPTR) : p(it) {}
+  ~AOutputIterator() {
+    if (p) p->release();
+  }
+  AOutputIterator(const AOutputIterator& rhs) : p(rhs.p) {}
+  AOutputIterator& operator++() {
+    p->next();
+    return *this;
+  }
+  bool operator==(const AOutputIterator& rhs) const {
+    if (p && rhs.p)
+      return p->current() == rhs.p->current();
+    else
+      return valid() == rhs.valid();
+  }
+  bool operator!=(const AOutputIterator& rhs) const { return !this->operator==(rhs); }
+  reference operator*() { return *reinterpret_cast<pointer>(p->current()); }
+  const_reference operator*() const { return *reinterpret_cast<const_pointer>(p->const_current()); }
+  bool valid() const { return p && p->current() != OPTIONAL_NULLPTR; }
+};
+
+template <class T>
+class AList {
+  IContainer* container;
+  bool owner;
+
+ public:
+  typedef T value_type;
+  typedef value_type& reference;
+  typedef const value_type& const_reference;
+  typedef value_type* pointer;
+  typedef const value_type* const_pointer;
+  typedef size_t size_type;
+  typedef AOutputIterator<T> iterator;
+  typedef const AOutputIterator<T>
const_iterator; + + public: + AList() : container(OPTIONAL_NULLPTR), owner(false) {} + AList(IContainer* c, bool take_ownership) : container(c), owner(take_ownership) {} + ~AList() { reset(); } + void reset(IContainer* c = OPTIONAL_NULLPTR, bool take_ownership = false) { + if (owner && container) container->release(); + container = c; + owner = take_ownership; + } + iterator begin() { return container ? iterator(container->begin()) : iterator(OPTIONAL_NULLPTR); } + iterator end() { return iterator(OPTIONAL_NULLPTR); } + size_type size() const { return container ? container->size() : 0; } + bool empty() const { return size() == 0; } +}; + +} // namespace util + +/** + * The channel profile. + */ +enum CHANNEL_PROFILE_TYPE { + /** + * 0: Communication. + * + * This profile prioritizes smoothness and applies to the one-to-one scenario. + */ + CHANNEL_PROFILE_COMMUNICATION = 0, + /** + * 1: (Default) Live Broadcast. + * + * This profile prioritizes supporting a large audience in a live broadcast channel. + */ + CHANNEL_PROFILE_LIVE_BROADCASTING = 1, + /** + * 2: Gaming. + * @deprecated This profile is deprecated. + */ + CHANNEL_PROFILE_GAME __deprecated = 2, + /** + * 3: Cloud Gaming. + * + * @deprecated This profile is deprecated. + */ + CHANNEL_PROFILE_CLOUD_GAMING __deprecated = 3, + + /** + * 4: Communication 1v1. + * @deprecated This profile is deprecated. + */ + CHANNEL_PROFILE_COMMUNICATION_1v1 __deprecated = 4, +}; + +/** + * The warning codes. + */ +enum WARN_CODE_TYPE { + /** + * 8: The specified view is invalid. To use the video function, you need to specify + * a valid view. + */ + WARN_INVALID_VIEW = 8, + /** + * 16: Fails to initialize the video function, probably due to a lack of + * resources. Users fail to see each other, but can still communicate with voice. + */ + WARN_INIT_VIDEO = 16, + /** + * 20: The request is pending, usually because some module is not ready, + * and the SDK postpones processing the request. 
+ */ + WARN_PENDING = 20, + /** + * 103: No channel resources are available, probably because the server cannot + * allocate any channel resource. + */ + WARN_NO_AVAILABLE_CHANNEL = 103, + /** + * 104: A timeout occurs when looking for the channel. When joining a channel, + * the SDK looks up the specified channel. This warning usually occurs when the + * network condition is too poor to connect to the server. + */ + WARN_LOOKUP_CHANNEL_TIMEOUT = 104, + /** + * 105: The server rejects the request to look for the channel. The server + * cannot process this request or the request is illegal. + */ + WARN_LOOKUP_CHANNEL_REJECTED = 105, + /** + * 106: A timeout occurs when opening the channel. Once the specific channel + * is found, the SDK opens the channel. This warning usually occurs when the + * network condition is too poor to connect to the server. + */ + WARN_OPEN_CHANNEL_TIMEOUT = 106, + /** + * 107: The server rejects the request to open the channel. The server + * cannot process this request or the request is illegal. + */ + WARN_OPEN_CHANNEL_REJECTED = 107, + + // sdk: 100~1000 + /** + * 111: A timeout occurs when switching the live video. + */ + WARN_SWITCH_LIVE_VIDEO_TIMEOUT = 111, + /** + * 118: A timeout occurs when setting the user role. + */ + WARN_SET_CLIENT_ROLE_TIMEOUT = 118, + /** + * 121: The ticket to open the channel is invalid. + */ + WARN_OPEN_CHANNEL_INVALID_TICKET = 121, + /** + * 122: The SDK is trying to connect to another server. + */ + WARN_OPEN_CHANNEL_TRY_NEXT_VOS = 122, + /** + * 131: The channel connection cannot be recovered. + */ + WARN_CHANNEL_CONNECTION_UNRECOVERABLE = 131, + /** + * 132: The SDK connection IP has changed. + */ + WARN_CHANNEL_CONNECTION_IP_CHANGED = 132, + /** + * 133: The SDK connection port has changed. + */ + WARN_CHANNEL_CONNECTION_PORT_CHANGED = 133, + /** 134: A socket error occurs; try to rejoin the channel.
+ */ + WARN_CHANNEL_SOCKET_ERROR = 134, + /** + * 701: An error occurs when opening the file for audio mixing. + */ + WARN_AUDIO_MIXING_OPEN_ERROR = 701, + /** + * 1014: Audio Device Module: An exception occurs in the playback device. + */ + WARN_ADM_RUNTIME_PLAYOUT_WARNING = 1014, + /** + * 1016: Audio Device Module: A warning occurs in the recording device. + */ + WARN_ADM_RUNTIME_RECORDING_WARNING = 1016, + /** + * 1019: Audio Device Module: No valid audio data is collected. + */ + WARN_ADM_RECORD_AUDIO_SILENCE = 1019, + /** + * 1020: Audio Device Module: The playback device fails to start. + */ + WARN_ADM_PLAYOUT_MALFUNCTION = 1020, + /** + * 1021: Audio Device Module: The recording device fails to start. + */ + WARN_ADM_RECORD_MALFUNCTION = 1021, + /** + * 1031: Audio Device Module: The recorded audio volume is too low. + */ + WARN_ADM_RECORD_AUDIO_LOWLEVEL = 1031, + /** + * 1032: Audio Device Module: The playback audio volume is too low. + */ + WARN_ADM_PLAYOUT_AUDIO_LOWLEVEL = 1032, + /** + * 1040: Audio device module: An exception occurs with the audio driver. + * Choose one of the following solutions: + * - Disable or re-enable the audio device. + * - Re-enable your device. + * - Update the sound card driver. + */ + WARN_ADM_WINDOWS_NO_DATA_READY_EVENT = 1040, + /** + * 1051: Audio Device Module: The SDK detects howling. + */ + WARN_APM_HOWLING = 1051, + /** + * 1052: Audio Device Module: The audio device is in a glitching state. + */ + WARN_ADM_GLITCH_STATE = 1052, + /** + * 1053: Audio Device Module: The settings are improper. + */ + WARN_ADM_IMPROPER_SETTINGS = 1053, + /** + * 1322: No recording device. + */ + WARN_ADM_WIN_CORE_NO_RECORDING_DEVICE = 1322, + /** + * 1323: Audio device module: No available playback device. + * You can try plugging in the audio device. + */ + WARN_ADM_WIN_CORE_NO_PLAYOUT_DEVICE = 1323, + /** + * 1324: Audio device module: The capture device is released improperly.
+ * Choose one of the following solutions: + * - Disable or re-enable the audio device. + * - Re-enable your audio device. + * - Update the sound card driver. + */ + WARN_ADM_WIN_CORE_IMPROPER_CAPTURE_RELEASE = 1324, +}; + +/** + * The error codes. + */ +enum ERROR_CODE_TYPE { + /** + * 0: No error occurs. + */ + ERR_OK = 0, + // 1~1000 + /** + * 1: A general error occurs (no specified reason). + */ + ERR_FAILED = 1, + /** + * 2: The argument is invalid. For example, the specific channel name + * includes illegal characters. + */ + ERR_INVALID_ARGUMENT = 2, + /** + * 3: The SDK module is not ready. Choose one of the following solutions: + * - Check the audio device. + * - Check the completeness of the app. + * - Reinitialize the RTC engine. + */ + ERR_NOT_READY = 3, + /** + * 4: The SDK does not support this function. + */ + ERR_NOT_SUPPORTED = 4, + /** + * 5: The request is rejected. + */ + ERR_REFUSED = 5, + /** + * 6: The buffer size is not big enough to store the returned data. + */ + ERR_BUFFER_TOO_SMALL = 6, + /** + * 7: The SDK is not initialized before calling this method. + */ + ERR_NOT_INITIALIZED = 7, + /** + * 8: The state is invalid. + */ + ERR_INVALID_STATE = 8, + /** + * 9: No permission. This is for internal use only, and does + * not return to the app through any method or callback. + */ + ERR_NO_PERMISSION = 9, + /** + * 10: An API timeout occurs. Some API methods require the SDK to return the + * execution result, and this error occurs if the request takes too long + * (more than 10 seconds) for the SDK to process. + */ + ERR_TIMEDOUT = 10, + /** + * 11: The request is cancelled. This is for internal use only, + * and does not return to the app through any method or callback. + */ + ERR_CANCELED = 11, + /** + * 12: The method is called too often. This is for internal use + * only, and does not return to the app through any method or + * callback. + */ + ERR_TOO_OFTEN = 12, + /** + * 13: The SDK fails to bind to the network socket.
This is for internal + * use only, and does not return to the app through any method or + * callback. + */ + ERR_BIND_SOCKET = 13, + /** + * 14: The network is unavailable. This is for internal use only, and + * does not return to the app through any method or callback. + */ + ERR_NET_DOWN = 14, + /** + * 17: The request to join the channel is rejected. This error usually occurs + * when the user is already in the channel, and still calls the method to join + * the channel, for example, \ref agora::rtc::IRtcEngine::joinChannel "joinChannel()". + */ + ERR_JOIN_CHANNEL_REJECTED = 17, + /** + * 18: The request to leave the channel is rejected. This error usually + * occurs when the user has already left the channel, and still calls the + * method to leave the channel, for example, \ref agora::rtc::IRtcEngine::leaveChannel + * "leaveChannel". + */ + ERR_LEAVE_CHANNEL_REJECTED = 18, + /** + * 19: The resources have been occupied and cannot be reused. + */ + ERR_ALREADY_IN_USE = 19, + /** + * 20: The SDK gives up the request due to too many requests. This is for + * internal use only, and does not return to the app through any method or callback. + */ + ERR_ABORTED = 20, + /** + * 21: On Windows, specific firewall settings can cause the SDK to fail to + * initialize and crash. + */ + ERR_INIT_NET_ENGINE = 21, + /** + * 22: The app uses too much of the system resource and the SDK + * fails to allocate any resource. + */ + ERR_RESOURCE_LIMITED = 22, + /** + * 101: The App ID is invalid, usually because the data format of the App ID is incorrect. + * + * Solution: Check the data format of your App ID. Ensure that you use the correct App ID to initialize the Agora service. + */ + ERR_INVALID_APP_ID = 101, + /** + * 102: The specified channel name is invalid. Please try to rejoin the + * channel with a valid channel name. + */ + ERR_INVALID_CHANNEL_NAME = 102, + /** + * 103: Fails to get server resources in the specified region. 
Please try to + * specify another region when calling \ref agora::rtc::IRtcEngine::initialize + * "initialize". + */ + ERR_NO_SERVER_RESOURCES = 103, + /** + * 109: The token has expired, usually for the following reasons: + * - Timeout for token authorization: Once a token is generated, you must use it to access the + * Agora service within 24 hours. Otherwise, the token times out and you can no longer use it. + * - The token privilege expires: To generate a token, you need to set a timestamp for the token + * privilege to expire. For example, if you set it to seven days, the token expires seven days after + * its usage. In that case, you can no longer access the Agora service. The users cannot make calls, + * or are kicked out of the channel. + * + * Solution: Regardless of whether token authorization times out or the token privilege expires, + * you need to generate a new token on your server, and try to join the channel. + */ + ERR_TOKEN_EXPIRED = 109, + /** + * 110: The token is invalid, usually for one of the following reasons: + * - Did not provide a token when joining a channel in a situation where the project has enabled the + * App Certificate. + * - Tried to join a channel with a token in a situation where the project has not enabled the App + * Certificate. + * - The App ID, user ID and channel name that you use to generate the token on the server do not match + * those that you use when joining a channel. + * + * Solution: + * - Before joining a channel, check whether your project has enabled the App certificate. If yes, you + * must provide a token when joining a channel; if no, join a channel without a token. + * - When using a token to join a channel, ensure that the App ID, user ID, and channel name that you + * use to generate the token are the same as the App ID that you use to initialize the Agora service, and + * the user ID and channel name that you use to join the channel.
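 + *
 + * A minimal usage sketch (hedged: `engine` is a placeholder for an initialized
 + * \ref agora::rtc::IRtcEngine "IRtcEngine" pointer, and the token string is a
 + * placeholder for a credential generated on your own server):
 + * @code{.cpp}
 + * const char* token = "<token-from-your-server>";  // placeholder value
 + * engine->joinChannel(token, "demo_channel", "", 12345);
 + * // If the SDK reports ERR_INVALID_TOKEN (110), generate a fresh token with
 + * // the same App ID, channel name, and user ID, then rejoin.
 + * @endcode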
+ */ + ERR_INVALID_TOKEN = 110, + /** + * 111: The internet connection is interrupted. This applies to the Agora Web + * SDK only. + */ + ERR_CONNECTION_INTERRUPTED = 111, // only used in web sdk + /** + * 112: The internet connection is lost. This applies to the Agora Web SDK + * only. + */ + ERR_CONNECTION_LOST = 112, // only used in web sdk + /** + * 113: The user is not in the channel when calling the + * \ref agora::rtc::IRtcEngine::sendStreamMessage "sendStreamMessage()" method. + */ + ERR_NOT_IN_CHANNEL = 113, + /** + * 114: The data size is over 1024 bytes when the user calls the + * \ref agora::rtc::IRtcEngine::sendStreamMessage "sendStreamMessage()" method. + */ + ERR_SIZE_TOO_LARGE = 114, + /** + * 115: The bitrate of the sent data exceeds the limit of 6 Kbps when the + * user calls the \ref agora::rtc::IRtcEngine::sendStreamMessage "sendStreamMessage()". + */ + ERR_BITRATE_LIMIT = 115, + /** + * 116: Too many data streams (over 5) are created when the user + * calls the \ref agora::rtc::IRtcEngine::createDataStream "createDataStream()" method. + */ + ERR_TOO_MANY_DATA_STREAMS = 116, + /** + * 117: A timeout occurs for the data stream transmission. + */ + ERR_STREAM_MESSAGE_TIMEOUT = 117, + /** + * 119: Switching the user role fails. Please try to rejoin the channel. + */ + ERR_SET_CLIENT_ROLE_NOT_AUTHORIZED = 119, + /** + * 120: Decryption fails. The user may have tried to join the channel with a wrong + * password. Check your settings or try rejoining the channel. + */ + ERR_DECRYPTION_FAILED = 120, + /** + * 121: The user ID is invalid. + */ + ERR_INVALID_USER_ID = 121, + /** + * 123: The app is banned by the server. + */ + ERR_CLIENT_IS_BANNED_BY_SERVER = 123, + /** + * 130: Encryption is enabled when the user calls the + * \ref agora::rtc::IRtcEngine::addPublishStreamUrl "addPublishStreamUrl()" method + * (CDN live streaming does not support encrypted streams). 
+ */ + ERR_ENCRYPTED_STREAM_NOT_ALLOWED_PUBLISH = 130, + + /** + * 131: The license credential is invalid. + */ + ERR_LICENSE_CREDENTIAL_INVALID = 131, + + /** + * 134: The user account is invalid, usually because the data format of the user account is incorrect. + */ + ERR_INVALID_USER_ACCOUNT = 134, + + /** 157: The necessary dynamic library is not integrated. For example, if you call + * the \ref agora::rtc::IRtcEngine::enableDeepLearningDenoise "enableDeepLearningDenoise" but do not integrate the dynamic + * library for the deep-learning noise reduction into your project, the SDK reports this error code. + * + */ + ERR_MODULE_NOT_FOUND = 157, + + // Licensing, keep the license error code same as the main version + ERR_CERT_RAW = 157, + ERR_CERT_JSON_PART = 158, + ERR_CERT_JSON_INVAL = 159, + ERR_CERT_JSON_NOMEM = 160, + ERR_CERT_CUSTOM = 161, + ERR_CERT_CREDENTIAL = 162, + ERR_CERT_SIGN = 163, + ERR_CERT_FAIL = 164, + ERR_CERT_BUF = 165, + ERR_CERT_NULL = 166, + ERR_CERT_DUEDATE = 167, + ERR_CERT_REQUEST = 168, + + // PcmSend Error num + ERR_PCMSEND_FORMAT = 200, // unsupported PCM format + ERR_PCMSEND_BUFFEROVERFLOW = 201, // buffer overflow; the PCM data is sent too quickly + + /// @cond + // signaling: 400~600 + ERR_LOGIN_ALREADY_LOGIN = 428, + + /// @endcond + // 1001~2000 + /** + * 1001: Fails to load the media engine. + */ + ERR_LOAD_MEDIA_ENGINE = 1001, + /** + * 1005: Audio device module: A general error occurs in the Audio Device Module (no specified + * reason). Check if the audio device is used by another app, or try + * rejoining the channel. + */ + ERR_ADM_GENERAL_ERROR = 1005, + /** + * 1008: Audio Device Module: An error occurs in initializing the playback + * device. + */ + ERR_ADM_INIT_PLAYOUT = 1008, + /** + * 1009: Audio Device Module: An error occurs in starting the playback device. + */ + ERR_ADM_START_PLAYOUT = 1009, + /** + * 1010: Audio Device Module: An error occurs in stopping the playback device.
+ */ + ERR_ADM_STOP_PLAYOUT = 1010, + /** + * 1011: Audio Device Module: An error occurs in initializing the recording + * device. + */ + ERR_ADM_INIT_RECORDING = 1011, + /** + * 1012: Audio Device Module: An error occurs in starting the recording device. + */ + ERR_ADM_START_RECORDING = 1012, + /** + * 1013: Audio Device Module: An error occurs in stopping the recording device. + */ + ERR_ADM_STOP_RECORDING = 1013, + /** + * 1501: Video Device Module: The camera is not authorized. + */ + ERR_VDM_CAMERA_NOT_AUTHORIZED = 1501, +}; + +enum LICENSE_ERROR_TYPE { + /** + * 1: Invalid license + */ + LICENSE_ERR_INVALID = 1, + /** + * 2: License expired + */ + LICENSE_ERR_EXPIRE = 2, + /** + * 3: Exceed license minutes limit + */ + LICENSE_ERR_MINUTES_EXCEED = 3, + /** + * 4: License use in limited period + */ + LICENSE_ERR_LIMITED_PERIOD = 4, + /** + * 5: Same license used in different devices at the same time + */ + LICENSE_ERR_DIFF_DEVICES = 5, + /** + * 99: SDK internal error + */ + LICENSE_ERR_INTERNAL = 99, +}; + +/** + * The operational permission of the SDK on the audio session. + */ +enum AUDIO_SESSION_OPERATION_RESTRICTION { + /** + * 0: No restriction; the SDK can change the audio session. + */ + AUDIO_SESSION_OPERATION_RESTRICTION_NONE = 0, + /** + * 1: The SDK cannot change the audio session category. + */ + AUDIO_SESSION_OPERATION_RESTRICTION_SET_CATEGORY = 1, + /** + * 2: The SDK cannot change the audio session category, mode, or categoryOptions. + */ + AUDIO_SESSION_OPERATION_RESTRICTION_CONFIGURE_SESSION = 1 << 1, + /** + * 4: The SDK keeps the audio session active when the user leaves the + * channel, for example, to play an audio file in the background. + */ + AUDIO_SESSION_OPERATION_RESTRICTION_DEACTIVATE_SESSION = 1 << 2, + /** + * 128: Completely restricts the operational permission of the SDK on the + * audio session; the SDK cannot change the audio session. 
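 + *
 + * The restriction values above are bit flags and can be combined. A sketch of
 + * combining them (the `setAudioSessionOperationRestriction` setter assumed
 + * here is the iOS-oriented API this enum is designed for; `engine` is a
 + * placeholder for an initialized engine pointer):
 + * @code{.cpp}
 + * int restriction = AUDIO_SESSION_OPERATION_RESTRICTION_SET_CATEGORY |
 + *                   AUDIO_SESSION_OPERATION_RESTRICTION_DEACTIVATE_SESSION;
 + * engine->setAudioSessionOperationRestriction(
 + *     (AUDIO_SESSION_OPERATION_RESTRICTION)restriction);
 + * @endcode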
+ */ + AUDIO_SESSION_OPERATION_RESTRICTION_ALL = 1 << 7, +}; + +typedef const char* user_id_t; +typedef void* view_t; + +/** + * The definition of the UserInfo struct. + */ +struct UserInfo { + /** + * ID of the user. + */ + util::AString userId; + /** + * Whether the user has enabled audio: + * - true: The user has enabled audio. + * - false: The user has disabled audio. + */ + bool hasAudio; + /** + * Whether the user has enabled video: + * - true: The user has enabled video. + * - false: The user has disabled video. + */ + bool hasVideo; + + UserInfo() : hasAudio(false), hasVideo(false) {} +}; + +typedef util::AList<UserInfo> UserList; + +// Shared between Agora Service and Rtc Engine +namespace rtc { + +/** + * Reasons for a user being offline. + */ +enum USER_OFFLINE_REASON_TYPE { + /** + * 0: The user leaves the current channel. + */ + USER_OFFLINE_QUIT = 0, + /** + * 1: The SDK times out and the user drops offline because no data packet was received within a certain + * period of time. If a user quits the call and the message is not passed to the SDK (due to an + * unreliable channel), the SDK assumes that the user drops offline. + */ + USER_OFFLINE_DROPPED = 1, + /** + * 2: The user switches the client role from the host to the audience. + */ + USER_OFFLINE_BECOME_AUDIENCE = 2, +}; + +enum INTERFACE_ID_TYPE { + AGORA_IID_AUDIO_DEVICE_MANAGER = 1, + AGORA_IID_VIDEO_DEVICE_MANAGER = 2, + AGORA_IID_PARAMETER_ENGINE = 3, + AGORA_IID_MEDIA_ENGINE = 4, + AGORA_IID_AUDIO_ENGINE = 5, + AGORA_IID_VIDEO_ENGINE = 6, + AGORA_IID_RTC_CONNECTION = 7, + AGORA_IID_SIGNALING_ENGINE = 8, + AGORA_IID_MEDIA_ENGINE_REGULATOR = 9, + AGORA_IID_CLOUD_SPATIAL_AUDIO = 10, + AGORA_IID_LOCAL_SPATIAL_AUDIO = 11, + AGORA_IID_STATE_SYNC = 13, + AGORA_IID_META_SERVICE = 14, + AGORA_IID_MUSIC_CONTENT_CENTER = 15, + AGORA_IID_H265_TRANSCODER = 16, +}; + +/** + * The network quality types. + */ +enum QUALITY_TYPE { + /** + * 0: The network quality is unknown. + * @deprecated This member is deprecated.
+ */ + QUALITY_UNKNOWN __deprecated = 0, + /** + * 1: The quality is excellent. + */ + QUALITY_EXCELLENT = 1, + /** + * 2: The quality is quite good, but the bitrate may be slightly + * lower than excellent. + */ + QUALITY_GOOD = 2, + /** + * 3: Users can feel the communication slightly impaired. + */ + QUALITY_POOR = 3, + /** + * 4: Users cannot communicate smoothly. + */ + QUALITY_BAD = 4, + /** + * 5: Users can barely communicate. + */ + QUALITY_VBAD = 5, + /** + * 6: Users cannot communicate at all. + */ + QUALITY_DOWN = 6, + /** + * 7: (For future use) The network quality cannot be detected. + */ + QUALITY_UNSUPPORTED = 7, + /** + * 8: Detecting the network quality. + */ + QUALITY_DETECTING = 8, +}; + +/** + * Content fit modes. + */ +enum FIT_MODE_TYPE { + /** + * 1: Uniformly scale the video until it fills the visible boundaries (cropped). + * One dimension of the video may have clipped contents. + */ + MODE_COVER = 1, + + /** + * 2: Uniformly scale the video until one of its dimensions fits the boundary + * (zoomed to fit). Areas that are not filled due to disparity in the aspect + * ratio are filled with black. + */ + MODE_CONTAIN = 2, +}; + +/** + * The rotation information. + */ +enum VIDEO_ORIENTATION { + /** + * 0: Rotate the video by 0 degrees clockwise. + */ + VIDEO_ORIENTATION_0 = 0, + /** + * 90: Rotate the video by 90 degrees clockwise. + */ + VIDEO_ORIENTATION_90 = 90, + /** + * 180: Rotate the video by 180 degrees clockwise. + */ + VIDEO_ORIENTATION_180 = 180, + /** + * 270: Rotate the video by 270 degrees clockwise. + */ + VIDEO_ORIENTATION_270 = 270 +}; + +/** + * The video frame rate. + */ +enum FRAME_RATE { + /** + * 1: 1 fps. + */ + FRAME_RATE_FPS_1 = 1, + /** + * 7: 7 fps. + */ + FRAME_RATE_FPS_7 = 7, + /** + * 10: 10 fps. + */ + FRAME_RATE_FPS_10 = 10, + /** + * 15: 15 fps. + */ + FRAME_RATE_FPS_15 = 15, + /** + * 24: 24 fps. + */ + FRAME_RATE_FPS_24 = 24, + /** + * 30: 30 fps. + */ + FRAME_RATE_FPS_30 = 30, + /** + * 60: 60 fps.
Applies to Windows and macOS only. + */ + FRAME_RATE_FPS_60 = 60, +}; + +enum FRAME_WIDTH { + FRAME_WIDTH_960 = 960, +}; + +enum FRAME_HEIGHT { + FRAME_HEIGHT_540 = 540, +}; + + +/** + * Types of the video frame. + */ +enum VIDEO_FRAME_TYPE { + /** 0: A black frame. */ + VIDEO_FRAME_TYPE_BLANK_FRAME = 0, + /** 3: Key frame. */ + VIDEO_FRAME_TYPE_KEY_FRAME = 3, + /** 4: Delta frame. */ + VIDEO_FRAME_TYPE_DELTA_FRAME = 4, + /** 5: The B frame.*/ + VIDEO_FRAME_TYPE_B_FRAME = 5, + /** 6: A discarded frame. */ + VIDEO_FRAME_TYPE_DROPPABLE_FRAME = 6, + /** Unknown frame. */ + VIDEO_FRAME_TYPE_UNKNOW +}; + +/** + * Video output orientation modes. + */ +enum ORIENTATION_MODE { + /** + * 0: The output video always follows the orientation of the captured video. The receiver takes + * the rotational information passed on from the video encoder. This mode applies to scenarios + * where video orientation can be adjusted on the receiver: + * - If the captured video is in landscape mode, the output video is in landscape mode. + * - If the captured video is in portrait mode, the output video is in portrait mode. + */ + ORIENTATION_MODE_ADAPTIVE = 0, + /** + * 1: Landscape mode. In this mode, the SDK always outputs videos in landscape (horizontal) mode. + * If the captured video is in portrait mode, the video encoder crops it to fit the output. Applies + * to situations where the receiving end cannot process the rotational information. For example, + * CDN live streaming. + */ + ORIENTATION_MODE_FIXED_LANDSCAPE = 1, + /** + * 2: Portrait mode. In this mode, the SDK always outputs video in portrait (vertical) mode. If + * the captured video is in landscape mode, the video encoder crops it to fit the output. Applies + * to situations where the receiving end cannot process the rotational information. For example, + * CDN live streaming. + */ + ORIENTATION_MODE_FIXED_PORTRAIT = 2, +}; + +/** + * (For future use) Video degradation preferences under limited bandwidth.
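 + *
 + * A sketch of where this preference is typically applied (this assumes a
 + * VideoEncoderConfiguration struct with a `degradationPreference` field and a
 + * setVideoEncoderConfiguration method, both declared outside this header, and
 + * a placeholder `engine` pointer):
 + * @code{.cpp}
 + * VideoEncoderConfiguration config;
 + * config.dimensions = VideoDimensions(1280, 720);
 + * config.frameRate = FRAME_RATE_FPS_30;
 + * config.degradationPreference = MAINTAIN_FRAMERATE;  // favor smoothness
 + * engine->setVideoEncoderConfiguration(config);
 + * @endcode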
+ */ +enum DEGRADATION_PREFERENCE { + /** + * 0: (Default) Prefers to reduce the video frame rate while maintaining video quality during video + * encoding under limited bandwidth. This degradation preference is suitable for scenarios where + * video quality is prioritized. + * @note In the COMMUNICATION channel profile, the resolution of the video sent may change, so + * remote users need to handle this issue. + */ + MAINTAIN_QUALITY = 0, + /** + * 1: Prefers to reduce the video quality while maintaining the video frame rate during video + * encoding under limited bandwidth. This degradation preference is suitable for scenarios where + * smoothness is prioritized and video quality is allowed to be reduced. + */ + MAINTAIN_FRAMERATE = 1, + /** + * 2: Reduces the video frame rate and video quality simultaneously during video encoding under + * limited bandwidth. MAINTAIN_BALANCED has a lower reduction than MAINTAIN_QUALITY and MAINTAIN_FRAMERATE, + * and this preference is suitable for scenarios where both smoothness and video quality are a + * priority. + */ + MAINTAIN_BALANCED = 2, + /** + * 3: Reduces the video frame rate in order to maintain the resolution. + */ + MAINTAIN_RESOLUTION = 3, + /** + * 100: Disables the video quality control (VQC) adjustment. + */ + DISABLED = 100, +}; + +/** + * The definition of the VideoDimensions struct. + */ +struct VideoDimensions { + /** + * The width of the video, in pixels. + */ + int width; + /** + * The height of the video, in pixels. + */ + int height; + VideoDimensions() : width(640), height(480) {} + VideoDimensions(int w, int h) : width(w), height(h) {} + bool operator==(const VideoDimensions& rhs) const { + return width == rhs.width && height == rhs.height; + } +}; + +/** + * (Recommended) 0: Standard bitrate mode. + * + * In this mode, the video bitrate is twice the base bitrate. + */ +const int STANDARD_BITRATE = 0; + +/** + * -1: Compatible bitrate mode. + * + * In this mode, the video bitrate is the same as the base bitrate.
If you choose + * this mode in the live-broadcast profile, the video frame rate may be lower + * than the set value. + */ +const int COMPATIBLE_BITRATE = -1; + +/** + * -1: (For future use) The default minimum bitrate. + */ +const int DEFAULT_MIN_BITRATE = -1; + +/** + * -2: (For future use) Sets the minimum bitrate to be the same as the target bitrate. + */ +const int DEFAULT_MIN_BITRATE_EQUAL_TO_TARGET_BITRATE = -2; + +/** + * The supported screen sharing frame rate capability levels. + */ +enum SCREEN_CAPTURE_FRAMERATE_CAPABILITY { + SCREEN_CAPTURE_FRAMERATE_CAPABILITY_15_FPS = 0, + SCREEN_CAPTURE_FRAMERATE_CAPABILITY_30_FPS = 1, + SCREEN_CAPTURE_FRAMERATE_CAPABILITY_60_FPS = 2, +}; + +/** + * Video codec capability levels. + */ +enum VIDEO_CODEC_CAPABILITY_LEVEL { + /** No specified level */ + CODEC_CAPABILITY_LEVEL_UNSPECIFIED = -1, + /** Only provide basic support for the codec type */ + CODEC_CAPABILITY_LEVEL_BASIC_SUPPORT = 5, + /** Can process 1080p video at a rate of approximately 30 fps. */ + CODEC_CAPABILITY_LEVEL_1080P30FPS = 10, + /** Can process 1080p video at a rate of approximately 60 fps. */ + CODEC_CAPABILITY_LEVEL_1080P60FPS = 20, + /** Can process 4K video at a rate of approximately 30 fps. */ + CODEC_CAPABILITY_LEVEL_4K60FPS = 30, +}; + +/** + * The video codec types. + */ +enum VIDEO_CODEC_TYPE { + VIDEO_CODEC_NONE = 0, + /** + * 1: Standard VP8. + */ + VIDEO_CODEC_VP8 = 1, + /** + * 2: Standard H.264. + */ + VIDEO_CODEC_H264 = 2, + /** + * 3: Standard H.265. + */ + VIDEO_CODEC_H265 = 3, + /** + * 6: Generic. This type is used for transmitting raw video data, such as encrypted video frames. + * The SDK returns this type of video frames in callbacks, and you need to decode and render the frames yourself. + */ + VIDEO_CODEC_GENERIC = 6, + /** + * 7: Generic H264. + */ + VIDEO_CODEC_GENERIC_H264 = 7, + /** + * 12: AV1. + */ + VIDEO_CODEC_AV1 = 12, + /** + * 13: VP9. + */ + VIDEO_CODEC_VP9 = 13, + /** + * 20: Generic JPEG.
This type consumes minimum computing resources and applies to IoT devices. + */ + VIDEO_CODEC_GENERIC_JPEG = 20, +}; + +/** + * The CC (Congestion Control) mode options. + */ +enum TCcMode { + /** + * Enable CC mode. + */ + CC_ENABLED, + /** + * Disable CC mode. + */ + CC_DISABLED, +}; + +/** + * The configuration for creating a local video track with an encoded image sender. + */ +struct SenderOptions { + /** + * Whether to enable CC mode. See #TCcMode. + */ + TCcMode ccMode; + /** + * The codec type used for the encoded images: \ref agora::rtc::VIDEO_CODEC_TYPE "VIDEO_CODEC_TYPE". + */ + VIDEO_CODEC_TYPE codecType; + + /** + * Target bitrate (Kbps) for video encoding. + * + * Choose one of the following options: + * + * - \ref agora::rtc::STANDARD_BITRATE "STANDARD_BITRATE": (Recommended) Standard bitrate. + * - Communication profile: The encoding bitrate equals the base bitrate. + * - Live-broadcast profile: The encoding bitrate is twice the base bitrate. + * - \ref agora::rtc::COMPATIBLE_BITRATE "COMPATIBLE_BITRATE": Compatible bitrate. The bitrate stays the same + * regardless of the profile. + * + * The Communication profile prioritizes smoothness, while the Live Broadcast + * profile prioritizes video quality (requiring a higher bitrate). Agora + * recommends setting the bitrate mode to \ref agora::rtc::STANDARD_BITRATE "STANDARD_BITRATE" to + * address this difference. + * + * The following table lists the recommended video encoder configurations, + * where the base bitrate applies to the communication profile. Set your + * bitrate based on this table. If the bitrate you set is beyond the proper + * range, the SDK automatically sets it to within the range.
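 + *
 + * A sketch of filling these options for an encoded-image sender (the values
 + * are illustrative, not recommendations):
 + * @code{.cpp}
 + * SenderOptions options;
 + * options.ccMode = CC_ENABLED;
 + * options.codecType = VIDEO_CODEC_H264;
 + * options.targetBitrate = STANDARD_BITRATE;  // let the SDK pick the bitrate
 + * @endcode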
+ + | Resolution | Frame Rate (fps) | Base Bitrate (Kbps, for Communication) | Live Bitrate (Kbps, for Live Broadcast)| + |------------------------|------------------|----------------------------------------|----------------------------------------| + | 160 × 120 | 15 | 65 | 130 | + | 120 × 120 | 15 | 50 | 100 | + | 320 × 180 | 15 | 140 | 280 | + | 180 × 180 | 15 | 100 | 200 | + | 240 × 180 | 15 | 120 | 240 | + | 320 × 240 | 15 | 200 | 400 | + | 240 × 240 | 15 | 140 | 280 | + | 424 × 240 | 15 | 220 | 440 | + | 640 × 360 | 15 | 400 | 800 | + | 360 × 360 | 15 | 260 | 520 | + | 640 × 360 | 30 | 600 | 1200 | + | 360 × 360 | 30 | 400 | 800 | + | 480 × 360 | 15 | 320 | 640 | + | 480 × 360 | 30 | 490 | 980 | + | 640 × 480 | 15 | 500 | 1000 | + | 480 × 480 | 15 | 400 | 800 | + | 640 × 480 | 30 | 750 | 1500 | + | 480 × 480 | 30 | 600 | 1200 | + | 848 × 480 | 15 | 610 | 1220 | + | 848 × 480 | 30 | 930 | 1860 | + | 640 × 480 | 10 | 400 | 800 | + | 1280 × 720 | 15 | 1130 | 2260 | + | 1280 × 720 | 30 | 1710 | 3420 | + | 960 × 720 | 15 | 910 | 1820 | + | 960 × 720 | 30 | 1380 | 2760 | + | 1920 × 1080 | 15 | 2080 | 4160 | + | 1920 × 1080 | 30 | 3150 | 6300 | + | 1920 × 1080 | 60 | 4780 | 6500 | + | 2560 × 1440 | 30 | 4850 | 6500 | + | 2560 × 1440 | 60 | 6500 | 6500 | + | 3840 × 2160 | 30 | 6500 | 6500 | + | 3840 × 2160 | 60 | 6500 | 6500 | + */ + int targetBitrate; + + SenderOptions() + : ccMode(CC_ENABLED), + codecType(VIDEO_CODEC_H265), + targetBitrate(6500) {} +}; + +/** + * Audio codec types. + */ +enum AUDIO_CODEC_TYPE { + /** + * 1: OPUS. + */ + AUDIO_CODEC_OPUS = 1, + // kIsac = 2, + /** + * 3: PCMA. + */ + AUDIO_CODEC_PCMA = 3, + /** + * 4: PCMU. + */ + AUDIO_CODEC_PCMU = 4, + /** + * 5: G722. + */ + AUDIO_CODEC_G722 = 5, + // kIlbc = 6, + /** 7: AAC. */ + // AUDIO_CODEC_AAC = 7, + /** + * 8: AAC LC. + */ + AUDIO_CODEC_AACLC = 8, + /** + * 9: HE AAC. + */ + AUDIO_CODEC_HEAAC = 9, + /** + * 10: JC1. + */ + AUDIO_CODEC_JC1 = 10, + /** + * 11: HE-AAC v2. 
+ */ + AUDIO_CODEC_HEAAC2 = 11, + /** + * 12: LPCNET. + */ + AUDIO_CODEC_LPCNET = 12, +}; + +/** + * Audio encoding types of the audio encoded frame observer. + */ +enum AUDIO_ENCODING_TYPE { + /** + * AAC encoding format, 16000 Hz sampling rate, bass quality. A file with an audio duration of 10 + * minutes is approximately 1.2 MB after encoding. + */ + AUDIO_ENCODING_TYPE_AAC_16000_LOW = 0x010101, + /** + * AAC encoding format, 16000 Hz sampling rate, medium sound quality. A file with an audio duration + * of 10 minutes is approximately 2 MB after encoding. + */ + AUDIO_ENCODING_TYPE_AAC_16000_MEDIUM = 0x010102, + /** + * AAC encoding format, 32000 Hz sampling rate, bass quality. A file with an audio duration of 10 + * minutes is approximately 1.2 MB after encoding. + */ + AUDIO_ENCODING_TYPE_AAC_32000_LOW = 0x010201, + /** + * AAC encoding format, 32000 Hz sampling rate, medium sound quality. A file with an audio duration + * of 10 minutes is approximately 2 MB after encoding. + */ + AUDIO_ENCODING_TYPE_AAC_32000_MEDIUM = 0x010202, + /** + * AAC encoding format, 32000 Hz sampling rate, high sound quality. A file with an audio duration of + * 10 minutes is approximately 3.5 MB after encoding. + */ + AUDIO_ENCODING_TYPE_AAC_32000_HIGH = 0x010203, + /** + * AAC encoding format, 48000 Hz sampling rate, medium sound quality. A file with an audio duration + * of 10 minutes is approximately 2 MB after encoding. + */ + AUDIO_ENCODING_TYPE_AAC_48000_MEDIUM = 0x010302, + /** + * AAC encoding format, 48000 Hz sampling rate, high sound quality. A file with an audio duration + * of 10 minutes is approximately 3.5 MB after encoding. + */ + AUDIO_ENCODING_TYPE_AAC_48000_HIGH = 0x010303, + /** + * OPUS encoding format, 16000 Hz sampling rate, bass quality. A file with an audio duration of 10 + * minutes is approximately 2 MB after encoding. + */ + AUDIO_ENCODING_TYPE_OPUS_16000_LOW = 0x020101, + /** + * OPUS encoding format, 16000 Hz sampling rate, medium sound quality. 
A file with an audio duration + * of 10 minutes is approximately 2 MB after encoding. + */ + AUDIO_ENCODING_TYPE_OPUS_16000_MEDIUM = 0x020102, + /** + * OPUS encoding format, 48000 Hz sampling rate, medium sound quality. A file with an audio duration + * of 10 minutes is approximately 2 MB after encoding. + */ + AUDIO_ENCODING_TYPE_OPUS_48000_MEDIUM = 0x020302, + /** + * OPUS encoding format, 48000 Hz sampling rate, high sound quality. A file with an audio duration of + * 10 minutes is approximately 3.5 MB after encoding. + */ + AUDIO_ENCODING_TYPE_OPUS_48000_HIGH = 0x020303, +}; + +/** + * The adaptation mode of the watermark. + */ +enum WATERMARK_FIT_MODE { + /** + * Use the `positionInLandscapeMode` and `positionInPortraitMode` values you set in #WatermarkOptions. + * The settings in `WatermarkRatio` are invalid. + */ + FIT_MODE_COVER_POSITION, + /** + * Use the value you set in `WatermarkRatio`. The settings in `positionInLandscapeMode` and `positionInPortraitMode` + * in `WatermarkOptions` are invalid. + */ + FIT_MODE_USE_IMAGE_RATIO +}; + +/** + * The advanced settings of encoded audio frame. + */ +struct EncodedAudioFrameAdvancedSettings { + EncodedAudioFrameAdvancedSettings() + : speech(true), + sendEvenIfEmpty(true) {} + + /** + * Determines whether the audio source is speech. + * - true: (Default) The audio source is speech. + * - false: The audio source is not speech. + */ + bool speech; + /** + * Whether to send the audio frame even when it is empty. + * - true: (Default) Send the audio frame even when it is empty. + * - false: Do not send the audio frame when it is empty. + */ + bool sendEvenIfEmpty; +}; + +/** + * The definition of the EncodedAudioFrameInfo struct. 
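 + *
 + * A minimal sketch of filling this struct for a 48 kHz stereo AAC frame (the
 + * timestamp helper is hypothetical, and the encoded-frame sender that consumes
 + * the struct is declared elsewhere):
 + * @code{.cpp}
 + * EncodedAudioFrameInfo info;
 + * info.codec = AUDIO_CODEC_AACLC;
 + * info.sampleRateHz = 48000;
 + * info.samplesPerChannel = 1024;  // the AAC default when unset
 + * info.numberOfChannels = 2;
 + * info.captureTimeMs = nowMs();  // hypothetical monotonic-clock helper
 + * @endcode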
+ */ +struct EncodedAudioFrameInfo { + EncodedAudioFrameInfo() + : codec(AUDIO_CODEC_AACLC), + sampleRateHz(0), + samplesPerChannel(0), + numberOfChannels(0), + captureTimeMs(0) {} + + EncodedAudioFrameInfo(const EncodedAudioFrameInfo& rhs) + : codec(rhs.codec), + sampleRateHz(rhs.sampleRateHz), + samplesPerChannel(rhs.samplesPerChannel), + numberOfChannels(rhs.numberOfChannels), + advancedSettings(rhs.advancedSettings), + captureTimeMs(rhs.captureTimeMs) {} + /** + * The audio codec: #AUDIO_CODEC_TYPE. + */ + AUDIO_CODEC_TYPE codec; + /** + * The sample rate (Hz) of the audio frame. + */ + int sampleRateHz; + /** + * The number of samples per audio channel. + * + * If this value is not set, it is 1024 for AAC, or 960 for OPUS by default. + */ + int samplesPerChannel; + /** + * The number of audio channels of the audio frame. + */ + int numberOfChannels; + /** + * The advanced settings of the audio frame. + */ + EncodedAudioFrameAdvancedSettings advancedSettings; + + /** + * This is an input parameter indicating the capture timestamp (ms) of the audio frame. + */ + int64_t captureTimeMs; +}; +/** + * The definition of the AudioPcmDataInfo struct. + */ +struct AudioPcmDataInfo { + AudioPcmDataInfo() : samplesPerChannel(0), channelNum(0), samplesOut(0), elapsedTimeMs(0), ntpTimeMs(0) {} + + AudioPcmDataInfo(const AudioPcmDataInfo& rhs) + : samplesPerChannel(rhs.samplesPerChannel), + channelNum(rhs.channelNum), + samplesOut(rhs.samplesOut), + elapsedTimeMs(rhs.elapsedTimeMs), + ntpTimeMs(rhs.ntpTimeMs) {} + + /** + * The sample count of the PCM data that you expect. + */ + size_t samplesPerChannel; + /** + * The number of audio channels. + */ + int16_t channelNum; + + // Output + /** + * The number of output samples. + */ + size_t samplesOut; + /** + * The rendering time (ms). + */ + int64_t elapsedTimeMs; + /** + * The NTP (Network Time Protocol) timestamp (ms). + */ + int64_t ntpTimeMs; +}; +/** + * Packetization modes. Applies to H.264 only. + */ +enum H264PacketizeMode { + /** + * Non-interleaved mode.
See RFC 6184. + */ + NonInterleaved = 0, // Mode 1 - STAP-A, FU-A is allowed + /** + * Single NAL unit mode. See RFC 6184. + */ + SingleNalUnit, // Mode 0 - only single NALU allowed +}; + +/** + * Video stream types. + */ +enum VIDEO_STREAM_TYPE { + /** + * 0: The high-quality video stream, which has a higher resolution and bitrate. + */ + VIDEO_STREAM_HIGH = 0, + /** + * 1: The low-quality video stream, which has a lower resolution and bitrate. + */ + VIDEO_STREAM_LOW = 1, +}; + +struct VideoSubscriptionOptions { + /** + * The type of the video stream to subscribe to. + * + * The default value is `VIDEO_STREAM_HIGH`, which means the high-quality + * video stream. + */ + Optional<VIDEO_STREAM_TYPE> type; + /** + * Whether to subscribe to encoded video data only: + * - `true`: Subscribe to encoded video data only. + * - `false`: (Default) Subscribe to decoded video data. + */ + Optional<bool> encodedFrameOnly; + + VideoSubscriptionOptions() {} +}; + + +/** The maximum length of the user account. + */ +enum MAX_USER_ACCOUNT_LENGTH_TYPE +{ + /** The maximum length of the user account is 256 bytes. + */ + MAX_USER_ACCOUNT_LENGTH = 256 +}; + +/** + * The definition of the EncodedVideoFrameInfo struct, which contains the information of the external encoded video frame.
+ */ +struct EncodedVideoFrameInfo { + EncodedVideoFrameInfo() + : uid(0), + codecType(VIDEO_CODEC_H264), + width(0), + height(0), + framesPerSecond(0), + frameType(VIDEO_FRAME_TYPE_BLANK_FRAME), + rotation(VIDEO_ORIENTATION_0), + trackId(0), + captureTimeMs(0), + decodeTimeMs(0), + streamType(VIDEO_STREAM_HIGH) {} + + EncodedVideoFrameInfo(const EncodedVideoFrameInfo& rhs) + : uid(rhs.uid), + codecType(rhs.codecType), + width(rhs.width), + height(rhs.height), + framesPerSecond(rhs.framesPerSecond), + frameType(rhs.frameType), + rotation(rhs.rotation), + trackId(rhs.trackId), + captureTimeMs(rhs.captureTimeMs), + decodeTimeMs(rhs.decodeTimeMs), + streamType(rhs.streamType) {} + + EncodedVideoFrameInfo& operator=(const EncodedVideoFrameInfo& rhs) { + if (this == &rhs) return *this; + uid = rhs.uid; + codecType = rhs.codecType; + width = rhs.width; + height = rhs.height; + framesPerSecond = rhs.framesPerSecond; + frameType = rhs.frameType; + rotation = rhs.rotation; + trackId = rhs.trackId; + captureTimeMs = rhs.captureTimeMs; + decodeTimeMs = rhs.decodeTimeMs; + streamType = rhs.streamType; + return *this; + } + + /** + * ID of the user that pushes the external encoded video frame. + */ + uid_t uid; + /** + * The codec type of the local video stream. See #VIDEO_CODEC_TYPE. The default value is `VIDEO_CODEC_H265 (3)`. + */ + VIDEO_CODEC_TYPE codecType; + /** + * The width (px) of the video frame. + */ + int width; + /** + * The height (px) of the video frame. + */ + int height; + /** + * The number of video frames per second. + * When this parameter is not 0, you can use it to calculate the Unix timestamp of the external + * encoded video frames. + */ + int framesPerSecond; + /** + * The video frame type: #VIDEO_FRAME_TYPE. + */ + VIDEO_FRAME_TYPE frameType; + /** + * The rotation information of the video frame: #VIDEO_ORIENTATION. + */ + VIDEO_ORIENTATION rotation; + /** + * The track ID of the video frame.
+ */ + int trackId; // This can be reserved for multiple video tracks, we need to create different ssrc + // and additional payload for later implementation. + /** + * This is an input parameter indicating the capture timestamp (ms) of the video. + */ + int64_t captureTimeMs; + /** + * The timestamp for decoding the video. + */ + int64_t decodeTimeMs; + /** + * The stream type of the video frame. + */ + VIDEO_STREAM_TYPE streamType; + +}; + +/** +* Video compression preference. +*/ +enum COMPRESSION_PREFERENCE { + /** + * (Default) Low latency is preferred, usually used in real-time communication where low latency is the number one priority. + */ + PREFER_LOW_LATENCY, + /** + * Prefer quality at the cost of some extra latency, usually around 30 ms to 150 ms depending on the target frame rate. + */ + PREFER_QUALITY, +}; + +/** +* The video encoder type preference. +*/ +enum ENCODING_PREFERENCE { + /** + * (Default) The SDK automatically selects the encoding type. + */ + PREFER_AUTO = -1, + /** + * Software encoding. + */ + PREFER_SOFTWARE = 0, + /** + * Hardware encoding. + */ + PREFER_HARDWARE = 1, +}; + +/** + * The definition of the AdvanceOptions struct. + */ +struct AdvanceOptions { + + /** + * The video encoder type preference. + */ + ENCODING_PREFERENCE encodingPreference; + + /** + * Video compression preference. + */ + COMPRESSION_PREFERENCE compressionPreference; + + AdvanceOptions() : encodingPreference(PREFER_AUTO), + compressionPreference(PREFER_LOW_LATENCY) {} + + AdvanceOptions(ENCODING_PREFERENCE encoding_preference, + COMPRESSION_PREFERENCE compression_preference) : + encodingPreference(encoding_preference), + compressionPreference(compression_preference) {} + + bool operator==(const AdvanceOptions& rhs) const { + return encodingPreference == rhs.encodingPreference && + compressionPreference == rhs.compressionPreference; + } + +}; + +/** + * Video mirror mode types. + */ +enum VIDEO_MIRROR_MODE_TYPE { + /** + * 0: The mirror mode determined by the SDK.
+ */ + VIDEO_MIRROR_MODE_AUTO = 0, + /** + * 1: Enable the mirror mode. + */ + VIDEO_MIRROR_MODE_ENABLED = 1, + /** + * 2: Disable the mirror mode. + */ + VIDEO_MIRROR_MODE_DISABLED = 2, +}; + + +/** Supported codec type bit mask. */ +enum CODEC_CAP_MASK { + /** 0: No codec support. */ + CODEC_CAP_MASK_NONE = 0, + + /** bit 1: Hardware decoder support flag. */ + CODEC_CAP_MASK_HW_DEC = 1 << 0, + + /** bit 2: Hardware encoder support flag. */ + CODEC_CAP_MASK_HW_ENC = 1 << 1, + + /** bit 3: Software decoder support flag. */ + CODEC_CAP_MASK_SW_DEC = 1 << 2, + + /** bit 4: Software encoder support flag. */ + CODEC_CAP_MASK_SW_ENC = 1 << 3, +}; + +struct CodecCapLevels { + /** The hardware decoding capability level. */ + VIDEO_CODEC_CAPABILITY_LEVEL hwDecodingLevel; + /** The software decoding capability level. */ + VIDEO_CODEC_CAPABILITY_LEVEL swDecodingLevel; + + CodecCapLevels(): hwDecodingLevel(CODEC_CAPABILITY_LEVEL_UNSPECIFIED), swDecodingLevel(CODEC_CAPABILITY_LEVEL_UNSPECIFIED) {} +}; + +/** The codec support information. */ +struct CodecCapInfo { + /** The codec type: #VIDEO_CODEC_TYPE. */ + VIDEO_CODEC_TYPE codecType; + /** The codec support flag. */ + int codecCapMask; + /** The codec capability level, estimated based on the device hardware.*/ + CodecCapLevels codecLevels; + + CodecCapInfo(): codecType(VIDEO_CODEC_NONE), codecCapMask(0) {} +}; + +/** + * The definition of the VideoEncoderConfiguration struct. + */ +struct VideoEncoderConfiguration { + /** + * The video encoder codec type: #VIDEO_CODEC_TYPE. + */ + VIDEO_CODEC_TYPE codecType; + /** + * The video dimension: VideoDimensions. + */ + VideoDimensions dimensions; + /** + * The frame rate of the video. You can set it manually, or choose one from #FRAME_RATE. + */ + int frameRate; + /** + * The bitrate (Kbps) of the video. + * + * Refer to the **Video Bitrate Table** below and set your bitrate. If you set a bitrate beyond the + * proper range, the SDK automatically adjusts it to a value within the range.
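Since `codecCapMask` is a bit mask, callers test individual capabilities with bitwise AND. A minimal standalone sketch, re-declaring the capability bits above (illustrative only, not the SDK header itself):

```cpp
#include <cassert>

// Simplified copy of the CODEC_CAP_MASK bits above (sketch only).
enum CODEC_CAP_MASK {
  CODEC_CAP_MASK_NONE = 0,
  CODEC_CAP_MASK_HW_DEC = 1 << 0,  // hardware decoder
  CODEC_CAP_MASK_HW_ENC = 1 << 1,  // hardware encoder
  CODEC_CAP_MASK_SW_DEC = 1 << 2,  // software decoder
  CODEC_CAP_MASK_SW_ENC = 1 << 3,  // software encoder
};

// True if the device can encode with this codec, in hardware or software.
inline bool canEncode(int codecCapMask) {
  return (codecCapMask & (CODEC_CAP_MASK_HW_ENC | CODEC_CAP_MASK_SW_ENC)) != 0;
}
```

The same pattern applies to decode checks with `CODEC_CAP_MASK_HW_DEC | CODEC_CAP_MASK_SW_DEC`.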
You can also choose + * from the following options: + * + * - #STANDARD_BITRATE: (Recommended) Standard bitrate mode. In this mode, the bitrates differ between + * the Live Broadcast and Communication profiles: + * - In the Communication profile, the video bitrate is the same as the base bitrate. + * - In the Live Broadcast profile, the video bitrate is twice the base bitrate. + * - #COMPATIBLE_BITRATE: Compatible bitrate mode. In this mode, the bitrate + * stays the same regardless of the profile. If you choose this mode for the Live Broadcast profile, + * the video frame rate may be lower than the set value. + * + * Agora uses different video codecs for different profiles to optimize the user experience. For example, + * the communication profile prioritizes smoothness, while the live-broadcast profile prioritizes + * video quality (a higher bitrate). Therefore, we recommend setting this parameter to #STANDARD_BITRATE. + * + * | Resolution             | Frame Rate (fps) | Base Bitrate (Kbps) | Live Bitrate (Kbps)| + * |------------------------|------------------|---------------------|--------------------| + * | 160 * 120              | 15               | 65                  | 110                | + * | 120 * 120              | 15               | 50                  | 90                 | + * | 320 * 180              | 15               | 140                 | 240                | + * | 180 * 180              | 15               | 100                 | 160                | + * | 240 * 180              | 15               | 120                 | 200                | + * | 320 * 240              | 15               | 200                 | 300                | + * | 240 * 240              | 15               | 140                 | 240                | + * | 424 * 240              | 15               | 220                 | 370                | + * | 640 * 360              | 15               | 400                 | 680                | + * | 360 * 360              | 15               | 260                 | 440                | + * | 640 * 360              | 30               | 600                 | 1030               | + * | 360 * 360              | 30               | 400                 | 670                | + * | 480 * 360              | 15               | 320                 | 550                | + * | 480 * 360              | 30               | 490                 | 830                | + * | 640 * 480              | 15               | 500                 | 750                | + * | 480 * 480              | 15               | 400                 | 680                | + * | 640 * 480              | 30               | 750                 | 1130               | + * | 480 * 480              | 30               | 600                 | 1030               | + * | 848 * 480              | 15               | 610                 | 920                | + * | 848 * 480              | 30               | 930                 | 1400               | + * | 640 * 480              | 10               | 400                 | 600                | + * | 960 * 540              | 15               | 750                 | 1100               | + * | 960 * 540              | 30               | 1110                | 1670               | + * | 1280 * 720             | 15               | 1130                | 1600               | + * 
| 1280 * 720             | 30               | 1710                | 2400               | + * | 960 * 720              | 15               | 910                 | 1280               | + * | 960 * 720              | 30               | 1380                | 2000               | + * | 1920 * 1080            | 15               | 2080                | 2500               | + * | 1920 * 1080            | 30               | 3150                | 3780               | + * | 1920 * 1080            | 60               | 4780                | 5730               | + * | 2560 * 1440            | 30               | 4850                | 4850               | + * | 2560 * 1440            | 60               | 7350                | 7350               | + * | 3840 * 2160            | 30               | 8910                | 8910               | + * | 3840 * 2160            | 60               | 13500               | 13500              | + */ + int bitrate; + + /** + * The minimum encoding bitrate (Kbps). + * + * The Agora SDK automatically adjusts the encoding bitrate to adapt to the + * network conditions. + * + * Using a value greater than the default value forces the video encoder to + * output high-quality images but may cause more packet loss and hence + * sacrifice the smoothness of the video transmission. That said, unless you + * have special requirements for image quality, Agora does not recommend + * changing this value. + * + * @note + * This parameter applies to the live-broadcast profile only. + */ + int minBitrate; + /** + * The video orientation mode: #ORIENTATION_MODE. + */ + ORIENTATION_MODE orientationMode; + /** + * The video degradation preference under limited bandwidth: #DEGRADATION_PREFERENCE. + */ + DEGRADATION_PREFERENCE degradationPreference; + + /** + * The mirror mode, which is disabled by default. + * If mirrorMode is set to VIDEO_MIRROR_MODE_ENABLED, the video frame is mirrored before encoding. + */ + VIDEO_MIRROR_MODE_TYPE mirrorMode; + + /** + * The advanced options for the video encoder configuration. See AdvanceOptions.
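The "twice the base bitrate" rule for the Live Broadcast profile is only approximate; the bitrate table is authoritative. A small lookup sketch over a few of its rows (illustrative only, not an SDK API):

```cpp
#include <cassert>

// A few rows copied from the bitrate table above (illustrative subset).
struct BitrateRow { int w, h, fps, baseKbps, liveKbps; };
static const BitrateRow kRows[] = {
  {160, 120, 15, 65, 110},
  {640, 360, 15, 400, 680},
  {1280, 720, 30, 1710, 2400},
  {1920, 1080, 60, 4780, 5730},
};

// Look up the recommended bitrate; `live` selects the Live Broadcast column.
inline int recommendedBitrate(int w, int h, int fps, bool live) {
  for (const BitrateRow& r : kRows) {
    if (r.w == w && r.h == h && r.fps == fps) return live ? r.liveKbps : r.baseKbps;
  }
  return -1;  // not in this abbreviated table
}
```

Note, for instance, that 640 * 360 at 15 fps maps to 400 Kbps base but 680 Kbps live, i.e. less than double.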
+ */ + AdvanceOptions advanceOptions; + + VideoEncoderConfiguration(const VideoDimensions& d, int f, int b, ORIENTATION_MODE m, VIDEO_MIRROR_MODE_TYPE mirror = VIDEO_MIRROR_MODE_DISABLED) + : codecType(VIDEO_CODEC_H265), + dimensions(d), + frameRate(f), + bitrate(b), + minBitrate(DEFAULT_MIN_BITRATE), + orientationMode(m), + degradationPreference(MAINTAIN_QUALITY), + mirrorMode(mirror), + advanceOptions(PREFER_AUTO, PREFER_LOW_LATENCY) {} + VideoEncoderConfiguration(int width, int height, int f, int b, ORIENTATION_MODE m, VIDEO_MIRROR_MODE_TYPE mirror = VIDEO_MIRROR_MODE_DISABLED) + : codecType(VIDEO_CODEC_H265), + dimensions(width, height), + frameRate(f), + bitrate(b), + minBitrate(DEFAULT_MIN_BITRATE), + orientationMode(m), + degradationPreference(MAINTAIN_QUALITY), + mirrorMode(mirror), + advanceOptions(PREFER_AUTO, PREFER_LOW_LATENCY) {} + VideoEncoderConfiguration(const VideoEncoderConfiguration& config) + : codecType(config.codecType), + dimensions(config.dimensions), + frameRate(config.frameRate), + bitrate(config.bitrate), + minBitrate(config.minBitrate), + orientationMode(config.orientationMode), + degradationPreference(config.degradationPreference), + mirrorMode(config.mirrorMode), + advanceOptions(config.advanceOptions) {} + VideoEncoderConfiguration() + : codecType(VIDEO_CODEC_H265), + dimensions(FRAME_WIDTH_960, FRAME_HEIGHT_540), + frameRate(FRAME_RATE_FPS_15), + bitrate(STANDARD_BITRATE), + minBitrate(DEFAULT_MIN_BITRATE), + orientationMode(ORIENTATION_MODE_ADAPTIVE), + degradationPreference(MAINTAIN_QUALITY), + mirrorMode(VIDEO_MIRROR_MODE_DISABLED), + advanceOptions(PREFER_AUTO, PREFER_LOW_LATENCY) {} + + VideoEncoderConfiguration& operator=(const VideoEncoderConfiguration& rhs) { + if (this == &rhs) return *this; + codecType = rhs.codecType; + dimensions = rhs.dimensions; + frameRate = rhs.frameRate; + bitrate = rhs.bitrate; + minBitrate = rhs.minBitrate; + orientationMode = rhs.orientationMode; + degradationPreference = 
rhs.degradationPreference; + mirrorMode = rhs.mirrorMode; + advanceOptions = rhs.advanceOptions; + return *this; + } +}; + +/** + * The configurations for the data stream. + */ +struct DataStreamConfig { + /** + * Whether to synchronize the data packet with the published audio packet. + * - `true`: Synchronize the data packet with the audio packet. + * - `false`: Do not synchronize the data packet with the audio packet. + * + * If you set the data packet to synchronize with the audio, then when the data packet delay is + * within the audio delay, the SDK triggers the `onStreamMessage` callback when the synchronized + * audio packet is played out. Do not set this parameter as true if you need the receiver to receive + * the data packet immediately. Agora recommends that you set this parameter to `true` only when you + * need to implement specific functions, for example, lyric synchronization. + */ + bool syncWithAudio; + /** + * Whether the SDK guarantees that the receiver receives the data in the sent order. + * - `true`: Guarantee that the receiver receives the data in the sent order. + * - `false`: Do not guarantee that the receiver receives the data in the sent order. + * + * Do not set this parameter as `true` if you need the receiver to receive the data packet immediately. + */ + bool ordered; +}; + +/** + * The definition of SIMULCAST_STREAM_MODE. + */ +enum SIMULCAST_STREAM_MODE { + /* + * Disable the simulcast stream until a remote broadcaster requests to enable it. + */ + AUTO_SIMULCAST_STREAM = -1, + /* + * Disable the simulcast stream. + */ + DISABLE_SIMULCAST_STREAM = 0, + /* + * Always enable the simulcast stream. + */ + ENABLE_SIMULCAST_STREAM = 1, +}; + +/** + * The configuration of the low-quality video stream. + */ +struct SimulcastStreamConfig { + /** + * The video frame dimension: VideoDimensions. The default value is 160 × 120. + */ + VideoDimensions dimensions; + /** + * The video bitrate (Kbps), represented by an instantaneous value.
The default value is 65. + */ + int kBitrate; + /** + * The capture frame rate (fps) of the local video. The default value is 5. + */ + int framerate; + SimulcastStreamConfig() : dimensions(160, 120), kBitrate(65), framerate(5) {} + bool operator==(const SimulcastStreamConfig& rhs) const { + return dimensions == rhs.dimensions && kBitrate == rhs.kBitrate && framerate == rhs.framerate; + } +}; + +/** + * The location of the target area relative to the screen or window. If you do not set this parameter, + * the SDK selects the whole screen or window. + */ +struct Rectangle { + /** + * The horizontal offset from the top-left corner. + */ + int x; + /** + * The vertical offset from the top-left corner. + */ + int y; + /** + * The width of the region. + */ + int width; + /** + * The height of the region. + */ + int height; + + Rectangle() : x(0), y(0), width(0), height(0) {} + Rectangle(int xx, int yy, int ww, int hh) : x(xx), y(yy), width(ww), height(hh) {} +}; + +/** + * The position and size of the watermark on the screen. + * + * The position and size of the watermark on the screen are determined by `xRatio`, `yRatio`, and `widthRatio`: + * - (`xRatio`, `yRatio`) refers to the coordinates of the upper left corner of the watermark, which determines + * the distance from the upper left corner of the watermark to the upper left corner of the screen. + * The `widthRatio` determines the width of the watermark. + */ +struct WatermarkRatio { + /** + * The x-coordinate of the upper left corner of the watermark. The horizontal position relative to + * the origin, where the upper left corner of the screen is the origin, and the x-coordinate is the + * upper left corner of the watermark. The value range is [0.0,1.0], and the default value is 0. + */ + float xRatio; + /** + * The y-coordinate of the upper left corner of the watermark.
The vertical position relative to the + * origin, where the upper left corner of the screen is the origin, and the y-coordinate is the upper + * left corner of the watermark. The value range is [0.0,1.0], and the default value is 0. + */ + float yRatio; + /** + * The width of the watermark. The SDK calculates the height of the watermark proportionally according + * to this parameter value to ensure that the enlarged or reduced watermark image is not distorted. + * The value range is [0,1], and the default value is 0, which means no watermark is displayed. + */ + float widthRatio; + + WatermarkRatio() : xRatio(0.0), yRatio(0.0), widthRatio(0.0) {} + WatermarkRatio(float x, float y, float width) : xRatio(x), yRatio(y), widthRatio(width) {} +}; + +/** + * Configurations of the watermark image. + */ +struct WatermarkOptions { + /** + * Whether or not the watermark image is visible in the local video preview: + * - true: (Default) The watermark image is visible in preview. + * - false: The watermark image is not visible in preview. + */ + bool visibleInPreview; + /** + * When the adaptation mode of the watermark is `FIT_MODE_COVER_POSITION`, it is used to set the + * area of the watermark image in landscape mode. See #FIT_MODE_COVER_POSITION for details. + */ + Rectangle positionInLandscapeMode; + /** + * When the adaptation mode of the watermark is `FIT_MODE_COVER_POSITION`, it is used to set the + * area of the watermark image in portrait mode. See #FIT_MODE_COVER_POSITION for details. + */ + Rectangle positionInPortraitMode; + /** + * When the watermark adaptation mode is `FIT_MODE_USE_IMAGE_RATIO`, this parameter is used to set + * the watermark coordinates. See WatermarkRatio for details. + */ + WatermarkRatio watermarkRatio; + /** + * The adaptation mode of the watermark. See #WATERMARK_FIT_MODE for details.
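As a concrete reading of the ratio semantics above, the following sketch maps a WatermarkRatio onto a pixel canvas, deriving the height from the watermark image's aspect ratio so it is not distorted. The canvas size and aspect ratio are assumed values for illustration; this is not an SDK function:

```cpp
#include <cassert>

// Pixel rectangle computed from watermark ratios (sketch only).
struct PixelRect { int x, y, w, h; };

// canvasW/canvasH: video frame size in pixels.
// imageAspect: watermark image width divided by its height.
inline PixelRect watermarkRect(float xRatio, float yRatio, float widthRatio,
                               int canvasW, int canvasH, float imageAspect) {
  PixelRect r;
  r.x = static_cast<int>(xRatio * canvasW);      // offset from the left edge
  r.y = static_cast<int>(yRatio * canvasH);      // offset from the top edge
  r.w = static_cast<int>(widthRatio * canvasW);  // width from widthRatio
  r.h = static_cast<int>(r.w / imageAspect);     // height keeps the image ratio
  return r;
}
```

With `widthRatio = 0`, the computed width (and thus the watermark) collapses to nothing, matching the "no watermark is displayed" default described above.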
+ */ + WATERMARK_FIT_MODE mode; + + WatermarkOptions() + : visibleInPreview(true), + positionInLandscapeMode(0, 0, 0, 0), + positionInPortraitMode(0, 0, 0, 0), + mode(FIT_MODE_COVER_POSITION) {} +}; + +/** + * The definition of the RtcStats struct. + */ +struct RtcStats { + /** + * The call duration (s), represented by an aggregate value. + */ + unsigned int duration; + /** + * The total number of bytes transmitted, represented by an aggregate value. + */ + unsigned int txBytes; + /** + * The total number of bytes received, represented by an aggregate value. + */ + unsigned int rxBytes; + /** + * The total number of audio bytes sent (bytes), represented by an aggregate value. + */ + unsigned int txAudioBytes; + /** + * The total number of video bytes sent (bytes), represented by an aggregate value. + */ + unsigned int txVideoBytes; + /** + * The total number of audio bytes received (bytes), represented by an aggregate value. + */ + unsigned int rxAudioBytes; + /** + * The total number of video bytes received (bytes), represented by an aggregate value. + */ + unsigned int rxVideoBytes; + /** + * The transmission bitrate (Kbps), represented by an instantaneous value. + */ + unsigned short txKBitRate; + /** + * The receiving bitrate (Kbps), represented by an instantaneous value. + */ + unsigned short rxKBitRate; + /** + * Audio receiving bitrate (Kbps), represented by an instantaneous value. + */ + unsigned short rxAudioKBitRate; + /** + * The audio transmission bitrate (Kbps), represented by an instantaneous value. + */ + unsigned short txAudioKBitRate; + /** + * The video receive bitrate (Kbps), represented by an instantaneous value. + */ + unsigned short rxVideoKBitRate; + /** + * The video transmission bitrate (Kbps), represented by an instantaneous value. + */ + unsigned short txVideoKBitRate; + /** + * The VOS client-server latency (ms). + */ + unsigned short lastmileDelay; + /** + * The number of users in the channel. 
+ */ + unsigned int userCount; + /** + * The app CPU usage (%). + * @note + * - The value of `cpuAppUsage` is always reported as 0 in the `onLeaveChannel` callback. + * - As of Android 8.1, you cannot get the CPU usage from this attribute due to system limitations. + */ + double cpuAppUsage; + /** + * The system CPU usage (%). + * + * For Windows, in the multi-core environment, this member represents the average CPU usage. The + * value = (100 - System Idle Progress in Task Manager)/100. + * @note + * - The value of `cpuTotalUsage` is always reported as 0 in the `onLeaveChannel` callback. + * - As of Android 8.1, you cannot get the CPU usage from this attribute due to system limitations. + */ + double cpuTotalUsage; + /** + * The round-trip time delay from the client to the local router. + * @note On Android, to get `gatewayRtt`, ensure that you add the `android.permission.ACCESS_WIFI_STATE` + * permission after `</application>` in the `AndroidManifest.xml` file in your project. + */ + int gatewayRtt; + /** + * The memory usage ratio of the app (%). + * @note This value is for reference only. Due to system limitations, you may not get this value. + */ + double memoryAppUsageRatio; + /** + * The memory usage ratio of the system (%). + * @note This value is for reference only. Due to system limitations, you may not get this value. + */ + double memoryTotalUsageRatio; + /** + * The memory usage of the app (KB). + * @note This value is for reference only. Due to system limitations, you may not get this value. + */ + int memoryAppUsageInKbytes; + /** + * The time elapsed from when the app starts connecting to an Agora channel + * to when the connection is established. 0 indicates that this member does not apply. + */ + int connectTimeMs; + /** + * The duration (ms) from when the app starts connecting to an Agora channel + * to when the first audio packet is received. 0 indicates that this member does not apply.
+ */ + int firstAudioPacketDuration; + /** + * The duration (ms) from when the app starts connecting to an Agora channel + * to when the first video packet is received. 0 indicates that this member does not apply. + */ + int firstVideoPacketDuration; + /** + * The duration (ms) from when the app starts connecting to an Agora channel + * to when the first video key frame is received. 0 indicates that this member does not apply. + */ + int firstVideoKeyFramePacketDuration; + /** + * The number of video packets before the first video key frame is received. + * 0 indicates that this member does not apply. + */ + int packetsBeforeFirstKeyFramePacket; + /** + * The duration (ms) from the last time the audio is unmuted to when the first audio packet is received. + * 0 indicates that this member does not apply. + */ + int firstAudioPacketDurationAfterUnmute; + /** + * The duration (ms) from the last time the video is unmuted to when the first video packet is received. + * 0 indicates that this member does not apply. + */ + int firstVideoPacketDurationAfterUnmute; + /** + * The duration (ms) from the last time the video is unmuted to when the first video key frame is received. + * 0 indicates that this member does not apply. + */ + int firstVideoKeyFramePacketDurationAfterUnmute; + /** + * The duration (ms) from the last time the video is unmuted to when the first video key frame is decoded. + * 0 indicates that this member does not apply. + */ + int firstVideoKeyFrameDecodedDurationAfterUnmute; + /** + * The duration (ms) from the last time the video is unmuted to when the first video key frame is rendered. + * 0 indicates that this member does not apply. + */ + int firstVideoKeyFrameRenderedDurationAfterUnmute; + /** + * The packet loss rate of the sender (broadcaster). + */ + int txPacketLossRate; + /** + * The packet loss rate of the receiver (audience).
+ */ + int rxPacketLossRate; + RtcStats() + : duration(0), + txBytes(0), + rxBytes(0), + txAudioBytes(0), + txVideoBytes(0), + rxAudioBytes(0), + rxVideoBytes(0), + txKBitRate(0), + rxKBitRate(0), + rxAudioKBitRate(0), + txAudioKBitRate(0), + rxVideoKBitRate(0), + txVideoKBitRate(0), + lastmileDelay(0), + userCount(0), + cpuAppUsage(0.0), + cpuTotalUsage(0.0), + gatewayRtt(0), + memoryAppUsageRatio(0.0), + memoryTotalUsageRatio(0.0), + memoryAppUsageInKbytes(0), + connectTimeMs(0), + firstAudioPacketDuration(0), + firstVideoPacketDuration(0), + firstVideoKeyFramePacketDuration(0), + packetsBeforeFirstKeyFramePacket(0), + firstAudioPacketDurationAfterUnmute(0), + firstVideoPacketDurationAfterUnmute(0), + firstVideoKeyFramePacketDurationAfterUnmute(0), + firstVideoKeyFrameDecodedDurationAfterUnmute(0), + firstVideoKeyFrameRenderedDurationAfterUnmute(0), + txPacketLossRate(0), + rxPacketLossRate(0) {} +}; + +/** + * User role types. + */ +enum CLIENT_ROLE_TYPE { + /** + * 1: Broadcaster. A broadcaster can both send and receive streams. + */ + CLIENT_ROLE_BROADCASTER = 1, + /** + * 2: Audience. An audience member can only receive streams. + */ + CLIENT_ROLE_AUDIENCE = 2, +}; + +/** + * Quality change of the local video in terms of target frame rate and target bit rate since last count. + */ +enum QUALITY_ADAPT_INDICATION { + /** + * 0: The quality of the local video stays the same. + */ + ADAPT_NONE = 0, + /** + * 1: The quality improves because the network bandwidth increases. + */ + ADAPT_UP_BANDWIDTH = 1, + /** + * 2: The quality worsens because the network bandwidth decreases. + */ + ADAPT_DOWN_BANDWIDTH = 2, +}; + +/** + * The latency level of an audience member in interactive live streaming. This enum takes effect only + * when the user role is set to `CLIENT_ROLE_AUDIENCE`. + */ +enum AUDIENCE_LATENCY_LEVEL_TYPE +{ + /** + * 1: Low latency. + */ + AUDIENCE_LATENCY_LEVEL_LOW_LATENCY = 1, + /** + * 2: Ultra low latency. 
+ */ + AUDIENCE_LATENCY_LEVEL_ULTRA_LOW_LATENCY = 2, +}; + +/** + * The detailed options of a user. + */ +struct ClientRoleOptions +{ + /** + * The latency level of an audience member in interactive live streaming. See `AUDIENCE_LATENCY_LEVEL_TYPE`. + */ + AUDIENCE_LATENCY_LEVEL_TYPE audienceLatencyLevel; + + ClientRoleOptions() + : audienceLatencyLevel(AUDIENCE_LATENCY_LEVEL_ULTRA_LOW_LATENCY) {} +}; + +/** + * Quality of experience (QoE) of the local user when receiving a remote audio stream. + */ +enum EXPERIENCE_QUALITY_TYPE { + /** 0: QoE of the local user is good. */ + EXPERIENCE_QUALITY_GOOD = 0, + /** 1: QoE of the local user is poor. */ + EXPERIENCE_QUALITY_BAD = 1, +}; + +/** + * Reasons why the QoE of the local user when receiving a remote audio stream is poor. + */ +enum EXPERIENCE_POOR_REASON { + /** + * 0: No reason, indicating good QoE of the local user. + */ + EXPERIENCE_REASON_NONE = 0, + /** + * 1: The remote user's network quality is poor. + */ + REMOTE_NETWORK_QUALITY_POOR = 1, + /** + * 2: The local user's network quality is poor. + */ + LOCAL_NETWORK_QUALITY_POOR = 2, + /** + * 4: The local user's Wi-Fi or mobile network signal is weak. + */ + WIRELESS_SIGNAL_POOR = 4, + /** + * 8: The local user enables both Wi-Fi and Bluetooth, and their signals interfere with each other. + * As a result, audio transmission quality is undermined. + */ + WIFI_BLUETOOTH_COEXIST = 8, +}; + +/** + * AI noise suppression (AINS) modes. + */ +enum AUDIO_AINS_MODE { + /** + * AINS mode with soft suppression level. + */ + AINS_MODE_BALANCED = 0, + /** + * AINS mode with high suppression level. + */ + AINS_MODE_AGGRESSIVE = 1, + /** + * AINS mode with high suppression level and ultra-low latency. + */ + AINS_MODE_ULTRALOWLATENCY = 2 +}; + +/** + * Audio profile types. + */ +enum AUDIO_PROFILE_TYPE { + /** + * 0: The default audio profile. + * - For the Communication profile: + * - Windows: A sample rate of 16 kHz, audio encoding, mono, and a bitrate of up to 16 Kbps.
+ * - Android/macOS/iOS: A sample rate of 32 kHz, audio encoding, mono, and a bitrate of up to 18 Kbps. + * - For the Live-broadcast profile: A sample rate of 48 kHz, music encoding, mono, and a bitrate of up to 64 Kbps. + */ + AUDIO_PROFILE_DEFAULT = 0, + /** + * 1: A sample rate of 32 kHz, audio encoding, mono, and a bitrate of up to 18 Kbps. + */ + AUDIO_PROFILE_SPEECH_STANDARD = 1, + /** + * 2: A sample rate of 48 kHz, music encoding, mono, and a bitrate of up to 64 Kbps. + */ + AUDIO_PROFILE_MUSIC_STANDARD = 2, + /** + * 3: A sample rate of 48 kHz, music encoding, stereo, and a bitrate of up to 80 Kbps. + * + * To implement stereo audio, you also need to call `setAdvancedAudioOptions` and set `audioProcessingChannels` + * to `AUDIO_PROCESSING_STEREO` in `AdvancedAudioOptions`. + */ + AUDIO_PROFILE_MUSIC_STANDARD_STEREO = 3, + /** + * 4: A sample rate of 48 kHz, music encoding, mono, and a bitrate of up to 96 Kbps. + */ + AUDIO_PROFILE_MUSIC_HIGH_QUALITY = 4, + /** + * 5: A sample rate of 48 kHz, music encoding, stereo, and a bitrate of up to 128 Kbps. + * + * To implement stereo audio, you also need to call `setAdvancedAudioOptions` and set `audioProcessingChannels` + * to `AUDIO_PROCESSING_STEREO` in `AdvancedAudioOptions`. + */ + AUDIO_PROFILE_MUSIC_HIGH_QUALITY_STEREO = 5, + /** + * 6: A sample rate of 16 kHz, audio encoding, mono, and Acoustic Echo Cancellation (AEC) enabled. + */ + AUDIO_PROFILE_IOT = 6, + AUDIO_PROFILE_NUM = 7 +}; + +/** + * The audio scenario. + */ +enum AUDIO_SCENARIO_TYPE { + /** + * 0: Automatic scenario, where the SDK chooses the appropriate audio quality according to the + * user role and audio route. + */ + AUDIO_SCENARIO_DEFAULT = 0, + /** + * 3: (Recommended) The live gaming scenario, which needs to enable gaming + * audio effects in the speaker. Choose this scenario to achieve high-fidelity + * music playback.
+ */ + AUDIO_SCENARIO_GAME_STREAMING = 3, + /** + * 5: The chatroom scenario, where audio capture keeps running even when the client role is set to audience. + * Normally, app developers can use the mute APIs to achieve the same result; the SDK implements + * this 'non-orthogonal' behavior only to keep the API backward compatible. + */ + AUDIO_SCENARIO_CHATROOM = 5, + /** + * 7: Real-time chorus scenario, where users have good network conditions and require ultra-low latency. + */ + AUDIO_SCENARIO_CHORUS = 7, + /** + * 8: The meeting scenario. + */ + AUDIO_SCENARIO_MEETING = 8, + /** + * 9: The number of enumerations. + */ + AUDIO_SCENARIO_NUM = 9, +}; + +/** + * The format of the video frame. + */ +struct VideoFormat { + OPTIONAL_ENUM_SIZE_T { + /** The maximum value (px) of the width. */ + kMaxWidthInPixels = 3840, + /** The maximum value (px) of the height. */ + kMaxHeightInPixels = 2160, + /** The maximum value (fps) of the frame rate. */ + kMaxFps = 60, + }; + + /** + * The width (px) of the video. + */ + int width; // Number of pixels. + /** + * The height (px) of the video. + */ + int height; // Number of pixels. + /** + * The video frame rate (fps). + */ + int fps; + VideoFormat() : width(FRAME_WIDTH_960), height(FRAME_HEIGHT_540), fps(FRAME_RATE_FPS_15) {} + VideoFormat(int w, int h, int f) : width(w), height(h), fps(f) {} + + bool operator<(const VideoFormat& fmt) const { + if (height != fmt.height) { + return height < fmt.height; + } else if (width != fmt.width) { + return width < fmt.width; + } else { + return fps < fmt.fps; + } + } + bool operator==(const VideoFormat& fmt) const { + return width == fmt.width && height == fmt.height && fps == fmt.fps; + } + bool operator!=(const VideoFormat& fmt) const { + return !operator==(fmt); + } +}; + +/** + * Video content hints. + */ +enum VIDEO_CONTENT_HINT { + /** + * (Default) No content hint. In this case, the SDK balances smoothness with sharpness.
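`VideoFormat::operator<` above defines a strict weak ordering by height, then width, then fps, which makes formats usable as keys in ordered containers such as `std::set`. A standalone check with a simplified copy of the comparison logic (sketch only, not the SDK struct):

```cpp
#include <cassert>

// Simplified copy of VideoFormat's comparison logic (sketch only).
struct Fmt {
  int width, height, fps;
  bool operator<(const Fmt& o) const {
    if (height != o.height) return height < o.height;  // primary key: height
    if (width != o.width) return width < o.width;      // secondary key: width
    return fps < o.fps;                                // tie-breaker: fps
  }
};
```

Because the comparison is a total order on (height, width, fps), equal formats compare neither less nor greater, satisfying the requirements of `std::set` and `std::map`.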
+ */
+ CONTENT_HINT_NONE,
+ /**
+ * Choose this option if you prefer smoothness or when
+ * you are sharing motion-intensive content such as a video clip, movie, or video game.
+ */
+ CONTENT_HINT_MOTION,
+ /**
+ * Choose this option if you prefer sharpness or when you are
+ * sharing motionless content such as a picture, PowerPoint slide, or text.
+ */
+ CONTENT_HINT_DETAILS
+};
+/**
+ * The screen sharing scenario.
+ */
+enum SCREEN_SCENARIO_TYPE {
+ /**
+ * 1: Document. This scenario prioritizes the video quality of screen sharing and reduces the
+ * latency of the shared video for the receiver. If you share documents, slides, and tables,
+ * you can set this scenario.
+ */
+ SCREEN_SCENARIO_DOCUMENT = 1,
+ /**
+ * 2: Game. This scenario prioritizes the smoothness of screen sharing. If you share games, you
+ * can set this scenario.
+ */
+ SCREEN_SCENARIO_GAMING = 2,
+ /**
+ * 3: Video. This scenario prioritizes the smoothness of screen sharing. If you share movies or
+ * live videos, you can set this scenario.
+ */
+ SCREEN_SCENARIO_VIDEO = 3,
+ /**
+ * 4: Remote control. This scenario prioritizes the video quality of screen sharing and reduces
+ * the latency of the shared video for the receiver. If you share the device desktop being
+ * remotely controlled, you can set this scenario.
+ */
+ SCREEN_SCENARIO_RDC = 4,
+};
+
+
+/**
+ * The video application scenario type.
+ */
+enum VIDEO_APPLICATION_SCENARIO_TYPE {
+ /**
+ * 0: The default scenario.
+ */
+ APPLICATION_SCENARIO_GENERAL = 0,
+ /**
+ * 1: The meeting scenario. This scenario represents the best QoE practice for meeting applications.
+ */
+ APPLICATION_SCENARIO_MEETING = 1,
+};
+
+/**
+ * The video QoE preference type.
+ */
+enum VIDEO_QOE_PREFERENCE_TYPE {
+ /**
+ * 1: (Default) Balances delay, picture quality, and smoothness.
+ */
+ VIDEO_QOE_PREFERENCE_BALANCE = 1,
+ /**
+ * 2: Prioritizes low end-to-end delay.
+ */
+ VIDEO_QOE_PREFERENCE_DELAY_FIRST = 2,
+ /**
+ * 3: Prioritizes picture quality.
+ */
+ VIDEO_QOE_PREFERENCE_PICTURE_QUALITY_FIRST = 3,
+ /**
+ * 4: Prioritizes smoothness.
+ */
+ VIDEO_QOE_PREFERENCE_FLUENCY_FIRST = 4,
+
+};
+
+/**
+ * The brightness level of the video image captured by the local camera.
+ */
+enum CAPTURE_BRIGHTNESS_LEVEL_TYPE {
+ /** -1: The SDK does not detect the brightness level of the video image.
+ * Wait a few seconds to get the brightness level from `CAPTURE_BRIGHTNESS_LEVEL_TYPE` in the next callback.
+ */
+ CAPTURE_BRIGHTNESS_LEVEL_INVALID = -1,
+ /** 0: The brightness level of the video image is normal.
+ */
+ CAPTURE_BRIGHTNESS_LEVEL_NORMAL = 0,
+ /** 1: The brightness level of the video image is too bright.
+ */
+ CAPTURE_BRIGHTNESS_LEVEL_BRIGHT = 1,
+ /** 2: The brightness level of the video image is too dark.
+ */
+ CAPTURE_BRIGHTNESS_LEVEL_DARK = 2,
+};
+
+/**
+ * Local audio states.
+ */
+enum LOCAL_AUDIO_STREAM_STATE {
+ /**
+ * 0: The local audio is in the initial state.
+ */
+ LOCAL_AUDIO_STREAM_STATE_STOPPED = 0,
+ /**
+ * 1: The capturing device starts successfully.
+ */
+ LOCAL_AUDIO_STREAM_STATE_RECORDING = 1,
+ /**
+ * 2: The first audio frame encodes successfully.
+ */
+ LOCAL_AUDIO_STREAM_STATE_ENCODING = 2,
+ /**
+ * 3: The local audio fails to start.
+ */
+ LOCAL_AUDIO_STREAM_STATE_FAILED = 3
+};
+
+/**
+ * Local audio state error codes.
+ */
+enum LOCAL_AUDIO_STREAM_REASON {
+ /**
+ * 0: The local audio is normal.
+ */
+ LOCAL_AUDIO_STREAM_REASON_OK = 0,
+ /**
+ * 1: No specified reason for the local audio failure. Remind your users to try to rejoin the channel.
+ */
+ LOCAL_AUDIO_STREAM_REASON_FAILURE = 1,
+ /**
+ * 2: No permission to use the local audio device. Remind your users to grant permission.
+ */
+ LOCAL_AUDIO_STREAM_REASON_DEVICE_NO_PERMISSION = 2,
+ /**
+ * 3: (Android and iOS only) The local audio capture device is in use. Remind your users to check
+ * whether another application occupies the microphone.
Local audio capture automatically resumes
+ * after the microphone is idle for about five seconds. You can also try to rejoin the channel
+ * after the microphone is idle.
+ */
+ LOCAL_AUDIO_STREAM_REASON_DEVICE_BUSY = 3,
+ /**
+ * 4: The local audio capture failed.
+ */
+ LOCAL_AUDIO_STREAM_REASON_RECORD_FAILURE = 4,
+ /**
+ * 5: The local audio encoding failed.
+ */
+ LOCAL_AUDIO_STREAM_REASON_ENCODE_FAILURE = 5,
+ /** 6: The SDK cannot find the local audio recording device.
+ */
+ LOCAL_AUDIO_STREAM_REASON_NO_RECORDING_DEVICE = 6,
+ /** 7: The SDK cannot find the local audio playback device.
+ */
+ LOCAL_AUDIO_STREAM_REASON_NO_PLAYOUT_DEVICE = 7,
+ /**
+ * 8: The local audio capturing is interrupted by a system call.
+ */
+ LOCAL_AUDIO_STREAM_REASON_INTERRUPTED = 8,
+ /** 9: An invalid audio capture device ID.
+ */
+ LOCAL_AUDIO_STREAM_REASON_RECORD_INVALID_ID = 9,
+ /** 10: An invalid audio playback device ID.
+ */
+ LOCAL_AUDIO_STREAM_REASON_PLAYOUT_INVALID_ID = 10,
+};
+
+/** Local video state types.
+ */
+enum LOCAL_VIDEO_STREAM_STATE {
+ /**
+ * 0: The local video is in the initial state.
+ */
+ LOCAL_VIDEO_STREAM_STATE_STOPPED = 0,
+ /**
+ * 1: The local video capturing device starts successfully. The SDK also reports this state when
+ * you call `startScreenCaptureByWindowId` to share a maximized window.
+ */
+ LOCAL_VIDEO_STREAM_STATE_CAPTURING = 1,
+ /**
+ * 2: The first video frame is successfully encoded.
+ */
+ LOCAL_VIDEO_STREAM_STATE_ENCODING = 2,
+ /**
+ * 3: Fails to start the local video.
+ */
+ LOCAL_VIDEO_STREAM_STATE_FAILED = 3
+};
+
+/**
+ * Local video state error codes.
+ */
+enum LOCAL_VIDEO_STREAM_REASON {
+ /**
+ * 0: The local video is normal.
+ */
+ LOCAL_VIDEO_STREAM_REASON_OK = 0,
+ /**
+ * 1: No specified reason for the local video failure.
+ */
+ LOCAL_VIDEO_STREAM_REASON_FAILURE = 1,
+ /**
+ * 2: No permission to use the local video capturing device. Remind the user to grant permission
+ * and rejoin the channel.
+ */ + LOCAL_VIDEO_STREAM_REASON_DEVICE_NO_PERMISSION = 2, + /** + * 3: The local video capturing device is in use. Remind the user to check whether another + * application occupies the camera. + */ + LOCAL_VIDEO_STREAM_REASON_DEVICE_BUSY = 3, + /** + * 4: The local video capture fails. Remind the user to check whether the video capture device + * is working properly or the camera is occupied by another application, and then to rejoin the + * channel. + */ + LOCAL_VIDEO_STREAM_REASON_CAPTURE_FAILURE = 4, + /** + * 5: The local video encoder is not supported. + */ + LOCAL_VIDEO_STREAM_REASON_CODEC_NOT_SUPPORT = 5, + /** + * 6: (iOS only) The app is in the background. Remind the user that video capture cannot be + * performed normally when the app is in the background. + */ + LOCAL_VIDEO_STREAM_REASON_CAPTURE_INBACKGROUND = 6, + /** + * 7: (iOS only) The current application window is running in Slide Over, Split View, or Picture + * in Picture mode, and another app is occupying the camera. Remind the user that the application + * cannot capture video properly when the app is running in Slide Over, Split View, or Picture in + * Picture mode and another app is occupying the camera. + */ + LOCAL_VIDEO_STREAM_REASON_CAPTURE_MULTIPLE_FOREGROUND_APPS = 7, + /** + * 8: Fails to find a local video capture device. Remind the user to check whether the camera is + * connected to the device properly or the camera is working properly, and then to rejoin the + * channel. + */ + LOCAL_VIDEO_STREAM_REASON_DEVICE_NOT_FOUND = 8, + /** + * 9: (macOS only) The video capture device currently in use is disconnected (such as being + * unplugged). + */ + LOCAL_VIDEO_STREAM_REASON_DEVICE_DISCONNECTED = 9, + /** + * 10: (macOS and Windows only) The SDK cannot find the video device in the video device list. + * Check whether the ID of the video device is valid. 
+ */
+ LOCAL_VIDEO_STREAM_REASON_DEVICE_INVALID_ID = 10,
+ /**
+ * 101: The current video capture device is unavailable due to excessive system pressure.
+ */
+ LOCAL_VIDEO_STREAM_REASON_DEVICE_SYSTEM_PRESSURE = 101,
+ /**
+ * 11: (macOS only) The shared window is minimized when you call `startScreenCaptureByWindowId`
+ * to share a window. The SDK cannot share a minimized window. You can cancel the minimization
+ * of this window at the application layer, for example by maximizing this window.
+ */
+ LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_MINIMIZED = 11,
+ /**
+ * 12: (macOS and Windows only) The error code indicates that a window shared by the window ID
+ * has been closed or a full-screen window shared by the window ID has exited full-screen mode.
+ * After exiting full-screen mode, remote users cannot see the shared window. To prevent remote
+ * users from seeing a black screen, Agora recommends that you immediately stop screen sharing.
+ *
+ * Common scenarios for reporting this error code:
+ * - When the local user closes the shared window, the SDK reports this error code.
+ * - The local user shows some slides in full-screen mode first, and then shares the windows of
+ * the slides. After the user exits full-screen mode, the SDK reports this error code.
+ * - The local user watches a web video or reads a web document in full-screen mode first, and
+ * then shares the window of the web video or document. After the user exits full-screen mode,
+ * the SDK reports this error code.
+ */
+ LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_CLOSED = 12,
+ /** 13: The local screen capture window is occluded. */
+ LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_OCCLUDED = 13,
+ /** 20: The local screen capture window is not supported. */
+ LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_NOT_SUPPORTED = 20,
+ /** 21: The screen capture fails. */
+ LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_FAILURE = 21,
+ /** 22: No permission to capture screen.
*/
+ LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_NO_PERMISSION = 22,
+ /**
+ * 24: (Windows only) An unexpected error (possibly due to window block failure) occurs during the screen
+ * sharing process, resulting in performance degradation. However, the screen sharing process itself is
+ * functioning normally.
+ */
+ LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_AUTO_FALLBACK = 24,
+ /** 25: (Windows only) The local screen capture window is currently hidden and not visible on the desktop. */
+ LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_HIDDEN = 25,
+ /** 26: (Windows only) The local screen capture window is recovered from its hidden state. */
+ LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_RECOVER_FROM_HIDDEN = 26,
+ /** 27: (Windows only) The local screen capture window is recovered from its minimized state. */
+ LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_WINDOW_RECOVER_FROM_MINIMIZED = 27,
+ /**
+ * 28: The screen capture is paused.
+ *
+ * Common scenarios for reporting this error code:
+ * - When the desktop switches to a secure desktop, such as the UAC dialog or the Winlogon desktop
+ * on Windows, the SDK reports this error code.
+ */
+ LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_PAUSED = 28,
+ /** 29: The screen capture is resumed. */
+ LOCAL_VIDEO_STREAM_REASON_SCREEN_CAPTURE_RESUMED = 29,
+
+};
+
+/**
+ * Remote audio states.
+ */
+enum REMOTE_AUDIO_STATE
+{
+ /**
+ * 0: The remote audio is in the default state. The SDK reports this state in the case of
+ * `REMOTE_AUDIO_REASON_LOCAL_MUTED(3)`, `REMOTE_AUDIO_REASON_REMOTE_MUTED(5)`, or
+ * `REMOTE_AUDIO_REASON_REMOTE_OFFLINE(7)`.
+ */
+ REMOTE_AUDIO_STATE_STOPPED = 0, // Default state, audio is started or remote user disabled/muted audio stream
+ /**
+ * 1: The first remote audio packet is received.
+ */
+ REMOTE_AUDIO_STATE_STARTING = 1, // The first audio frame packet has been received
+ /**
+ * 2: The remote audio stream is decoded and plays normally.
The SDK reports this state in the case of
+ * `REMOTE_AUDIO_REASON_NETWORK_RECOVERY(2)`, `REMOTE_AUDIO_REASON_LOCAL_UNMUTED(4)`, or
+ * `REMOTE_AUDIO_REASON_REMOTE_UNMUTED(6)`.
+ */
+ REMOTE_AUDIO_STATE_DECODING = 2, // The first remote audio frame has been decoded or the frozen state ends
+ /**
+ * 3: The remote audio is frozen. The SDK reports this state in the case of
+ * `REMOTE_AUDIO_REASON_NETWORK_CONGESTION(1)`.
+ */
+ REMOTE_AUDIO_STATE_FROZEN = 3, // Remote audio is frozen, probably due to network issue
+ /**
+ * 4: The remote audio fails to start. The SDK reports this state in the case of
+ * `REMOTE_AUDIO_REASON_INTERNAL(0)`.
+ */
+ REMOTE_AUDIO_STATE_FAILED = 4, // Remote audio play failed
+};
+
+/**
+ * Reasons for the remote audio state change.
+ */
+enum REMOTE_AUDIO_STATE_REASON
+{
+ /**
+ * 0: The SDK reports this reason when the audio state changes.
+ */
+ REMOTE_AUDIO_REASON_INTERNAL = 0,
+ /**
+ * 1: Network congestion.
+ */
+ REMOTE_AUDIO_REASON_NETWORK_CONGESTION = 1,
+ /**
+ * 2: Network recovery.
+ */
+ REMOTE_AUDIO_REASON_NETWORK_RECOVERY = 2,
+ /**
+ * 3: The local user stops receiving the remote audio stream or
+ * disables the audio module.
+ */
+ REMOTE_AUDIO_REASON_LOCAL_MUTED = 3,
+ /**
+ * 4: The local user resumes receiving the remote audio stream or
+ * enables the audio module.
+ */
+ REMOTE_AUDIO_REASON_LOCAL_UNMUTED = 4,
+ /**
+ * 5: The remote user stops sending the audio stream or disables the
+ * audio module.
+ */
+ REMOTE_AUDIO_REASON_REMOTE_MUTED = 5,
+ /**
+ * 6: The remote user resumes sending the audio stream or enables the
+ * audio module.
+ */
+ REMOTE_AUDIO_REASON_REMOTE_UNMUTED = 6,
+ /**
+ * 7: The remote user leaves the channel.
+ */
+ REMOTE_AUDIO_REASON_REMOTE_OFFLINE = 7,
+};
+
+/**
+ * The state of the remote video.
+ */
+enum REMOTE_VIDEO_STATE {
+ /**
+ * 0: The remote video is in the default state.
The SDK reports this state in the case of + * `REMOTE_VIDEO_STATE_REASON_LOCAL_MUTED (3)`, `REMOTE_VIDEO_STATE_REASON_REMOTE_MUTED (5)`, + * `REMOTE_VIDEO_STATE_REASON_REMOTE_OFFLINE (7)`, or `REMOTE_VIDEO_STATE_REASON_AUDIO_FALLBACK (8)`. + */ + REMOTE_VIDEO_STATE_STOPPED = 0, + /** + * 1: The first remote video packet is received. + */ + REMOTE_VIDEO_STATE_STARTING = 1, + /** + * 2: The remote video stream is decoded and plays normally. The SDK reports this state in the case of + * `REMOTE_VIDEO_STATE_REASON_NETWORK_RECOVERY (2)`, `REMOTE_VIDEO_STATE_REASON_LOCAL_UNMUTED (4)`, + * `REMOTE_VIDEO_STATE_REASON_REMOTE_UNMUTED (6)`, or `REMOTE_VIDEO_STATE_REASON_AUDIO_FALLBACK_RECOVERY (9)`. + */ + REMOTE_VIDEO_STATE_DECODING = 2, + /** 3: The remote video is frozen, probably due to + * #REMOTE_VIDEO_STATE_REASON_NETWORK_CONGESTION (1). + */ + REMOTE_VIDEO_STATE_FROZEN = 3, + /** 4: The remote video fails to start. The SDK reports this state in the case of + * `REMOTE_VIDEO_STATE_REASON_INTERNAL (0)`. + */ + REMOTE_VIDEO_STATE_FAILED = 4, +}; +/** + * The reason for the remote video state change. + */ +enum REMOTE_VIDEO_STATE_REASON { + /** + * 0: The SDK reports this reason when the video state changes. + */ + REMOTE_VIDEO_STATE_REASON_INTERNAL = 0, + /** + * 1: Network congestion. + */ + REMOTE_VIDEO_STATE_REASON_NETWORK_CONGESTION = 1, + /** + * 2: Network recovery. + */ + REMOTE_VIDEO_STATE_REASON_NETWORK_RECOVERY = 2, + /** + * 3: The local user stops receiving the remote video stream or disables the video module. + */ + REMOTE_VIDEO_STATE_REASON_LOCAL_MUTED = 3, + /** + * 4: The local user resumes receiving the remote video stream or enables the video module. + */ + REMOTE_VIDEO_STATE_REASON_LOCAL_UNMUTED = 4, + /** + * 5: The remote user stops sending the video stream or disables the video module. + */ + REMOTE_VIDEO_STATE_REASON_REMOTE_MUTED = 5, + /** + * 6: The remote user resumes sending the video stream or enables the video module. 
+ */
+ REMOTE_VIDEO_STATE_REASON_REMOTE_UNMUTED = 6,
+ /**
+ * 7: The remote user leaves the channel.
+ */
+ REMOTE_VIDEO_STATE_REASON_REMOTE_OFFLINE = 7,
+ /** 8: The remote audio-and-video stream falls back to the audio-only stream
+ * due to poor network conditions.
+ */
+ REMOTE_VIDEO_STATE_REASON_AUDIO_FALLBACK = 8,
+ /** 9: The remote audio-only stream switches back to the audio-and-video
+ * stream after the network conditions improve.
+ */
+ REMOTE_VIDEO_STATE_REASON_AUDIO_FALLBACK_RECOVERY = 9,
+ /** (Internal use only) 10: The remote video stream type changes to the low stream type.
+ */
+ REMOTE_VIDEO_STATE_REASON_VIDEO_STREAM_TYPE_CHANGE_TO_LOW = 10,
+ /** (Internal use only) 11: The remote video stream type changes to the high stream type.
+ */
+ REMOTE_VIDEO_STATE_REASON_VIDEO_STREAM_TYPE_CHANGE_TO_HIGH = 11,
+ /** (iOS only) 12: The app of the remote user is in the background.
+ */
+ REMOTE_VIDEO_STATE_REASON_SDK_IN_BACKGROUND = 12,
+
+ /** 13: The remote video stream is not supported by the decoder.
+ */
+ REMOTE_VIDEO_STATE_REASON_CODEC_NOT_SUPPORT = 13,
+
+};
+
+/**
+ * The remote user state information.
+ */
+enum REMOTE_USER_STATE {
+ /**
+ * The remote user has muted the audio.
+ */
+ USER_STATE_MUTE_AUDIO = (1 << 0),
+ /**
+ * The remote user has muted the video.
+ */
+ USER_STATE_MUTE_VIDEO = (1 << 1),
+ /**
+ * The remote user has enabled the video, which includes video capturing and encoding.
+ */
+ USER_STATE_ENABLE_VIDEO = (1 << 4),
+ /**
+ * The remote user has enabled the local video capturing.
+ */
+ USER_STATE_ENABLE_LOCAL_VIDEO = (1 << 8),
+};
+
+/**
+ * The definition of the VideoTrackInfo struct, which contains information of
+ * the video track.
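The `REMOTE_USER_STATE` values above are bit flags, so one reported state can combine several of them. A standalone sketch of decoding such a mask; the enum mirrors the values above, while `hasState` is a hypothetical helper, not an SDK function:

```cpp
#include <cassert>
#include <cstdint>

// Standalone sketch: bit flags mirroring the REMOTE_USER_STATE values.
enum RemoteUserState : std::uint32_t {
  kMuteAudio        = 1u << 0,
  kMuteVideo        = 1u << 1,
  kEnableVideo      = 1u << 4,
  kEnableLocalVideo = 1u << 8,
};

// True if every flag in `flags` is set in `state`.
inline bool hasState(std::uint32_t state, std::uint32_t flags) {
  return (state & flags) == flags;
}
```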
+ */
+struct VideoTrackInfo {
+ VideoTrackInfo()
+ : isLocal(false), ownerUid(0), trackId(0), channelId(OPTIONAL_NULLPTR)
+ , streamType(VIDEO_STREAM_HIGH), codecType(VIDEO_CODEC_H265)
+ , encodedFrameOnly(false), sourceType(VIDEO_SOURCE_CAMERA_PRIMARY)
+ , observationPosition(agora::media::base::POSITION_POST_CAPTURER) {}
+ /**
+ * Whether the video track is local or remote.
+ * - true: The video track is local.
+ * - false: The video track is remote.
+ */
+ bool isLocal;
+ /**
+ * ID of the user who publishes the video track.
+ */
+ uid_t ownerUid;
+ /**
+ * ID of the video track.
+ */
+ track_id_t trackId;
+ /**
+ * The channel ID of the video track.
+ */
+ const char* channelId;
+ /**
+ * The video stream type: #VIDEO_STREAM_TYPE.
+ */
+ VIDEO_STREAM_TYPE streamType;
+ /**
+ * The video codec type: #VIDEO_CODEC_TYPE.
+ */
+ VIDEO_CODEC_TYPE codecType;
+ /**
+ * Whether the video track contains encoded video frames only.
+ * - true: The video track contains encoded video frames only.
+ * - false: The video track does not contain encoded video frames only.
+ */
+ bool encodedFrameOnly;
+ /**
+ * The video source type: #VIDEO_SOURCE_TYPE
+ */
+ VIDEO_SOURCE_TYPE sourceType;
+ /**
+ * The frame position for the video observer: #VIDEO_MODULE_POSITION
+ */
+ uint32_t observationPosition;
+};
+
+/**
+ * The downscale level of the remote video stream. The higher the downscale level, the more the video downscales.
+ */
+enum REMOTE_VIDEO_DOWNSCALE_LEVEL {
+ /**
+ * No downscale.
+ */
+ REMOTE_VIDEO_DOWNSCALE_LEVEL_NONE,
+ /**
+ * Downscale level 1.
+ */
+ REMOTE_VIDEO_DOWNSCALE_LEVEL_1,
+ /**
+ * Downscale level 2.
+ */
+ REMOTE_VIDEO_DOWNSCALE_LEVEL_2,
+ /**
+ * Downscale level 3.
+ */
+ REMOTE_VIDEO_DOWNSCALE_LEVEL_3,
+ /**
+ * Downscale level 4.
+ */
+ REMOTE_VIDEO_DOWNSCALE_LEVEL_4,
+};
+
+/**
+ * The volume information of users.
+ */
+struct AudioVolumeInfo {
+ /**
+ * User ID of the speaker.
+ * - In the local user's callback, `uid` = 0.
+ * - In the remote users' callback, `uid` is the user ID of a remote user whose instantaneous + * volume is one of the three highest. + */ + uid_t uid; + /** + * The volume of the user. The value ranges between 0 (the lowest volume) and 255 (the highest + * volume). If the user calls `startAudioMixing`, the value of volume is the volume after audio + * mixing. + */ + unsigned int volume; // [0,255] + /** + * Voice activity status of the local user. + * - 0: The local user is not speaking. + * - 1: The local user is speaking. + * @note + * - The `vad` parameter does not report the voice activity status of remote users. In a remote + * user's callback, the value of `vad` is always 1. + * - To use this parameter, you must set `reportVad` to true when calling `enableAudioVolumeIndication`. + */ + unsigned int vad; + /** + * The voice pitch (Hz) of the local user. The value ranges between 0.0 and 4000.0. + * @note The `voicePitch` parameter does not report the voice pitch of remote users. In the + * remote users' callback, the value of `voicePitch` is always 0.0. + */ + double voicePitch; + + AudioVolumeInfo() : uid(0), volume(0), vad(0), voicePitch(0.0) {} +}; + +/** + * The audio device information. + */ +struct DeviceInfo { + /* + * Whether the audio device supports ultra-low-latency capture and playback: + * - `true`: The device supports ultra-low-latency capture and playback. + * - `false`: The device does not support ultra-low-latency capture and playback. + */ + bool isLowLatencyAudioSupported; + + DeviceInfo() : isLowLatencyAudioSupported(false) {} +}; + +/** + * The definition of the IPacketObserver struct. + */ +class IPacketObserver { + public: + virtual ~IPacketObserver() {} + /** + * The definition of the Packet struct. + */ + struct Packet { + /** + * The buffer address of the sent or received data. + * @note Agora recommends setting `buffer` to a value larger than 2048 bytes. Otherwise, you + * may encounter undefined behaviors (such as crashes). 
+ */
+ const unsigned char* buffer;
+ /**
+ * The buffer size of the sent or received data.
+ */
+ unsigned int size;
+
+ Packet() : buffer(OPTIONAL_NULLPTR), size(0) {}
+ };
+ /**
+ * Occurs when the SDK is ready to send the audio packet.
+ * @param packet The audio packet to be sent: Packet.
+ * @return Whether to send the audio packet:
+ * - true: Send the packet.
+ * - false: Do not send the packet, in which case the audio packet will be discarded.
+ */
+ virtual bool onSendAudioPacket(Packet& packet) = 0;
+ /**
+ * Occurs when the SDK is ready to send the video packet.
+ * @param packet The video packet to be sent: Packet.
+ * @return Whether to send the video packet:
+ * - true: Send the packet.
+ * - false: Do not send the packet, in which case the video packet will be discarded.
+ */
+ virtual bool onSendVideoPacket(Packet& packet) = 0;
+ /**
+ * Occurs when the audio packet is received.
+ * @param packet The received audio packet: Packet.
+ * @return Whether to process the audio packet:
+ * - true: Process the packet.
+ * - false: Do not process the packet, in which case the audio packet will be discarded.
+ */
+ virtual bool onReceiveAudioPacket(Packet& packet) = 0;
+ /**
+ * Occurs when the video packet is received.
+ * @param packet The received video packet: Packet.
+ * @return Whether to process the video packet:
+ * - true: Process the packet.
+ * - false: Do not process the packet, in which case the video packet will be discarded.
+ */
+ virtual bool onReceiveVideoPacket(Packet& packet) = 0;
+};
+
+/**
+ * Audio sample rate types.
+ */
+enum AUDIO_SAMPLE_RATE_TYPE {
+ /**
+ * 32000: 32 kHz.
+ */
+ AUDIO_SAMPLE_RATE_32000 = 32000,
+ /**
+ * 44100: 44.1 kHz.
+ */
+ AUDIO_SAMPLE_RATE_44100 = 44100,
+ /**
+ * 48000: 48 kHz.
+ */
+ AUDIO_SAMPLE_RATE_48000 = 48000,
+};
+/**
+ * The codec type of the output video.
+ */
+enum VIDEO_CODEC_TYPE_FOR_STREAM {
+ /**
+ * 1: H.264.
+ */
+ VIDEO_CODEC_H264_FOR_STREAM = 1,
+ /**
+ * 2: H.265.
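Returning `false` from any `IPacketObserver` callback discards the packet, so an implementation is effectively a per-packet filter. A standalone sketch of that decision logic; `PacketView`, `shouldSend`, and the `kMaxPacketSize` limit are illustrative app-side names, not SDK constants:

```cpp
#include <cassert>

// Standalone sketch of the filtering decision an onSendAudioPacket /
// onSendVideoPacket override might make: a false return drops the packet.
struct PacketView {
  const unsigned char* buffer;
  unsigned int size;
};

// Hypothetical app-side size budget, not an SDK value.
constexpr unsigned int kMaxPacketSize = 1200;

// Send only packets with a valid, non-empty buffer that fit the budget.
inline bool shouldSend(const PacketView& p) {
  return p.buffer != nullptr && p.size > 0 && p.size <= kMaxPacketSize;
}
```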
+ */
+ VIDEO_CODEC_H265_FOR_STREAM = 2,
+};
+
+/**
+ * Video codec profile types.
+ */
+enum VIDEO_CODEC_PROFILE_TYPE {
+ /**
+ * 66: Baseline video codec profile. Generally used in video calls on mobile phones.
+ */
+ VIDEO_CODEC_PROFILE_BASELINE = 66,
+ /**
+ * 77: Main video codec profile. Generally used in mainstream electronics, such as MP4 players, portable video players, PSP, and iPads.
+ */
+ VIDEO_CODEC_PROFILE_MAIN = 77,
+ /**
+ * 100: High video codec profile. Generally used in high-resolution broadcasts or television.
+ */
+ VIDEO_CODEC_PROFILE_HIGH = 100,
+};
+
+
+/**
+ * Self-defined audio codec profile.
+ */
+enum AUDIO_CODEC_PROFILE_TYPE {
+ /**
+ * 0: LC-AAC.
+ */
+ AUDIO_CODEC_PROFILE_LC_AAC = 0,
+ /**
+ * 1: HE-AAC.
+ */
+ AUDIO_CODEC_PROFILE_HE_AAC = 1,
+ /**
+ * 2: HE-AAC v2.
+ */
+ AUDIO_CODEC_PROFILE_HE_AAC_V2 = 2,
+};
+
+/**
+ * Local audio statistics.
+ */
+struct LocalAudioStats
+{
+ /**
+ * The number of audio channels.
+ */
+ int numChannels;
+ /**
+ * The sampling rate (Hz) of sending the local user's audio stream.
+ */
+ int sentSampleRate;
+ /**
+ * The average bitrate (Kbps) of sending the local user's audio stream.
+ */
+ int sentBitrate;
+ /**
+ * The internal payload codec.
+ */
+ int internalCodec;
+ /**
+ * The packet loss rate (%) from the local client to the Agora server before applying the anti-packet loss strategies.
+ */
+ unsigned short txPacketLossRate;
+ /**
+ * The audio delay of the device, which contains the recording and playout delay.
+ */
+ int audioDeviceDelay;
+ /**
+ * The playout delay of the device.
+ */
+ int audioPlayoutDelay;
+ /**
+ * The signal delay estimated from audio in-ear monitoring (ms).
+ */
+ int earMonitorDelay;
+ /**
+ * The signal delay estimated during the AEC process from the near-end and far-end signals (ms).
+ */
+ int aecEstimatedDelay;
+};
+
+
+/**
+ * States of the Media Push.
+ */
+enum RTMP_STREAM_PUBLISH_STATE {
+ /**
+ * 0: The Media Push has not started or has ended.
This state is also triggered after you remove an RTMP or RTMPS stream from the CDN by calling `removePublishStreamUrl`.
+ */
+ RTMP_STREAM_PUBLISH_STATE_IDLE = 0,
+ /**
+ * 1: The SDK is connecting to Agora's streaming server and the CDN server. This state is triggered after you call the `addPublishStreamUrl` method.
+ */
+ RTMP_STREAM_PUBLISH_STATE_CONNECTING = 1,
+ /**
+ * 2: The RTMP or RTMPS streaming is being published. The SDK returns this state after it successfully publishes the RTMP or RTMPS streaming.
+ */
+ RTMP_STREAM_PUBLISH_STATE_RUNNING = 2,
+ /**
+ * 3: The RTMP or RTMPS streaming is recovering. When exceptions occur to the CDN, or the streaming is interrupted, the SDK tries to resume RTMP or RTMPS streaming and returns this state.
+ * - If the SDK successfully resumes the streaming, #RTMP_STREAM_PUBLISH_STATE_RUNNING (2) returns.
+ * - If the streaming does not resume within 60 seconds or server errors occur, #RTMP_STREAM_PUBLISH_STATE_FAILURE (4) returns. You can also reconnect to the server by calling the `removePublishStreamUrl` and `addPublishStreamUrl` methods.
+ */
+ RTMP_STREAM_PUBLISH_STATE_RECOVERING = 3,
+ /**
+ * 4: The RTMP or RTMPS streaming fails. See the `errCode` parameter for the detailed error information. You can also call the `addPublishStreamUrl` method to publish the RTMP or RTMPS streaming again.
+ */
+ RTMP_STREAM_PUBLISH_STATE_FAILURE = 4,
+ /**
+ * 5: The SDK is disconnecting from Agora's streaming server and the CDN server. This state is triggered after you call the `removePublishStreamUrl` method.
+ */
+ RTMP_STREAM_PUBLISH_STATE_DISCONNECTING = 5,
+};
+
+/**
+ * Error codes of the RTMP or RTMPS streaming.
+ */
+enum RTMP_STREAM_PUBLISH_REASON {
+ /**
+ * 0: The RTMP or RTMPS streaming publishes successfully.
+ */
+ RTMP_STREAM_PUBLISH_REASON_OK = 0,
+ /**
+ * 1: Invalid argument used.
If, for example, you do not call the `setLiveTranscoding` method to configure the LiveTranscoding parameters before calling the addPublishStreamUrl method, + * the SDK returns this error. Check whether you set the parameters in the `setLiveTranscoding` method properly. + */ + RTMP_STREAM_PUBLISH_REASON_INVALID_ARGUMENT = 1, + /** + * 2: The RTMP or RTMPS streaming is encrypted and cannot be published. + */ + RTMP_STREAM_PUBLISH_REASON_ENCRYPTED_STREAM_NOT_ALLOWED = 2, + /** + * 3: Timeout for the RTMP or RTMPS streaming. Call the `addPublishStreamUrl` method to publish the streaming again. + */ + RTMP_STREAM_PUBLISH_REASON_CONNECTION_TIMEOUT = 3, + /** + * 4: An error occurs in Agora's streaming server. Call the `addPublishStreamUrl` method to publish the streaming again. + */ + RTMP_STREAM_PUBLISH_REASON_INTERNAL_SERVER_ERROR = 4, + /** + * 5: An error occurs in the CDN server. + */ + RTMP_STREAM_PUBLISH_REASON_RTMP_SERVER_ERROR = 5, + /** + * 6: The RTMP or RTMPS streaming publishes too frequently. + */ + RTMP_STREAM_PUBLISH_REASON_TOO_OFTEN = 6, + /** + * 7: The host publishes more than 10 URLs. Delete the unnecessary URLs before adding new ones. + */ + RTMP_STREAM_PUBLISH_REASON_REACH_LIMIT = 7, + /** + * 8: The host manipulates other hosts' URLs. Check your app logic. + */ + RTMP_STREAM_PUBLISH_REASON_NOT_AUTHORIZED = 8, + /** + * 9: Agora's server fails to find the RTMP or RTMPS streaming. + */ + RTMP_STREAM_PUBLISH_REASON_STREAM_NOT_FOUND = 9, + /** + * 10: The format of the RTMP or RTMPS streaming URL is not supported. Check whether the URL format is correct. + */ + RTMP_STREAM_PUBLISH_REASON_FORMAT_NOT_SUPPORTED = 10, + /** + * 11: The user role is not host, so the user cannot use the CDN live streaming function. Check your application code logic. 
+ */
+ RTMP_STREAM_PUBLISH_REASON_NOT_BROADCASTER = 11, // Note: match to ERR_PUBLISH_STREAM_NOT_BROADCASTER in AgoraBase.h
+ /**
+ * 13: The `updateRtmpTranscoding` or `setLiveTranscoding` method is called to update the transcoding configuration in a scenario where there is streaming without transcoding. Check your application code logic.
+ */
+ RTMP_STREAM_PUBLISH_REASON_TRANSCODING_NO_MIX_STREAM = 13, // Note: match to ERR_PUBLISH_STREAM_TRANSCODING_NO_MIX_STREAM in AgoraBase.h
+ /**
+ * 14: Errors occurred in the host's network.
+ */
+ RTMP_STREAM_PUBLISH_REASON_NET_DOWN = 14, // Note: match to ERR_NET_DOWN in AgoraBase.h
+ /**
+ * 15: Your App ID does not have permission to use the CDN live streaming function.
+ */
+ RTMP_STREAM_PUBLISH_REASON_INVALID_APPID = 15, // Note: match to ERR_PUBLISH_STREAM_APPID_INVALID in AgoraBase.h
+ /** 16: Invalid privilege. */
+ RTMP_STREAM_PUBLISH_REASON_INVALID_PRIVILEGE = 16,
+ /**
+ * 100: The streaming has been stopped normally. After you call `removePublishStreamUrl` to stop streaming, the SDK returns this value.
+ */
+ RTMP_STREAM_UNPUBLISH_REASON_OK = 100,
+};
+
+/** Events during the RTMP or RTMPS streaming. */
+enum RTMP_STREAMING_EVENT {
+ /**
+ * 1: An error occurs when you add a background image or a watermark image to the RTMP or RTMPS stream.
+ */
+ RTMP_STREAMING_EVENT_FAILED_LOAD_IMAGE = 1,
+ /**
+ * 2: The streaming URL is already being used for CDN live streaming. If you want to start new streaming, use a new streaming URL.
+ */
+ RTMP_STREAMING_EVENT_URL_ALREADY_IN_USE = 2,
+ /**
+ * 3: The feature is not supported.
+ */
+ RTMP_STREAMING_EVENT_ADVANCED_FEATURE_NOT_SUPPORT = 3,
+ /**
+ * 4: The client requests too frequently.
+ */
+ RTMP_STREAMING_EVENT_REQUEST_TOO_OFTEN = 4,
+};
+
+/**
+ * Image properties.
+ */
+typedef struct RtcImage {
+ /**
+ * The HTTP/HTTPS URL address of the image in the live video. The maximum length of this parameter is 1024 bytes.
+ */ + const char* url; + /** + * The x coordinate (pixel) of the image on the video frame (taking the upper left corner of the video frame as the origin). + */ + int x; + /** + * The y coordinate (pixel) of the image on the video frame (taking the upper left corner of the video frame as the origin). + */ + int y; + /** + * The width (pixel) of the image on the video frame. + */ + int width; + /** + * The height (pixel) of the image on the video frame. + */ + int height; + /** + * The layer index of the watermark or background image. When you use the watermark array to add + * a watermark or multiple watermarks, you must pass a value to `zOrder` in the range [1,255]; + * otherwise, the SDK reports an error. In other cases, zOrder can optionally be passed in the + * range [0,255], with 0 being the default value. 0 means the bottom layer and 255 means the top + * layer. + */ + int zOrder; + /** The transparency level of the image. The value ranges between 0.0 and 1.0: + * + * - 0.0: Completely transparent. + * - 1.0: (Default) Opaque. + */ + double alpha; + + RtcImage() : url(OPTIONAL_NULLPTR), x(0), y(0), width(0), height(0), zOrder(0), alpha(1.0) {} +} RtcImage; +/** + * The configuration for advanced features of the RTMP or RTMPS streaming with transcoding. + * + * If you want to enable the advanced features of streaming with transcoding, contact support@agora.io. + */ +struct LiveStreamAdvancedFeature { + LiveStreamAdvancedFeature() : featureName(OPTIONAL_NULLPTR), opened(false) {} + LiveStreamAdvancedFeature(const char* feat_name, bool open) : featureName(feat_name), opened(open) {} + /** The advanced feature for high-quality video with a lower bitrate. */ + // static const char* LBHQ = "lbhq"; + /** The advanced feature for the optimized video encoder. */ + // static const char* VEO = "veo"; + + /** + * The feature names, including LBHQ (high-quality video with a lower bitrate) and VEO (optimized video encoder). 
+ */ + const char* featureName; + + /** + * Whether to enable the advanced features of streaming with transcoding: + * - `true`: Enable the advanced feature. + * - `false`: (Default) Disable the advanced feature. + */ + bool opened; +} ; + +/** + * Connection state types. + */ +enum CONNECTION_STATE_TYPE +{ + /** + * 1: The SDK is disconnected from the Agora edge server. The state indicates the SDK is in one of the following phases: + * - The initial state before calling the `joinChannel` method. + * - The app calls the `leaveChannel` method. + */ + CONNECTION_STATE_DISCONNECTED = 1, + /** + * 2: The SDK is connecting to the Agora edge server. This state indicates that the SDK is + * establishing a connection with the specified channel after the app calls `joinChannel`. + * - If the SDK successfully joins the channel, it triggers the `onConnectionStateChanged` + * callback and the connection state switches to `CONNECTION_STATE_CONNECTED`. + * - After the connection is established, the SDK also initializes the media and triggers + * `onJoinChannelSuccess` when everything is ready. + */ + CONNECTION_STATE_CONNECTING = 2, + /** + * 3: The SDK is connected to the Agora edge server. This state also indicates that the user + * has joined a channel and can now publish or subscribe to a media stream in the channel. + * If the connection to the Agora edge server is lost because, for example, the network is down + * or switched, the SDK automatically tries to reconnect and triggers `onConnectionStateChanged` + * that indicates the connection state switches to `CONNECTION_STATE_RECONNECTING`. + */ + CONNECTION_STATE_CONNECTED = 3, + /** + * 4: The SDK keeps reconnecting to the Agora edge server. The SDK keeps rejoining the channel + * after being disconnected from a joined channel because of network issues. 
+ * - If the SDK cannot rejoin the channel within 10 seconds, it triggers `onConnectionLost`, + * stays in the `CONNECTION_STATE_RECONNECTING` state, and keeps rejoining the channel. + * - If the SDK fails to rejoin the channel 20 minutes after being disconnected from the Agora + * edge server, the SDK triggers the `onConnectionStateChanged` callback, switches to the + * `CONNECTION_STATE_FAILED` state, and stops rejoining the channel. + */ + CONNECTION_STATE_RECONNECTING = 4, + /** + * 5: The SDK fails to connect to the Agora edge server or join the channel. This state indicates + * that the SDK stops trying to rejoin the channel. You must call `leaveChannel` to leave the + * channel. + * - You can call `joinChannel` to rejoin the channel. + * - If the SDK is banned from joining the channel by the Agora edge server through the RESTful + * API, the SDK triggers the `onConnectionStateChanged` callback. + */ + CONNECTION_STATE_FAILED = 5, +}; + +/** + * Transcoding configurations of each host. + */ +struct TranscodingUser { + /** + * The user ID of the host. + */ + uid_t uid; + /** + * The x coordinate (pixel) of the host's video on the output video frame (taking the upper left corner of the video frame as the origin). The value range is [0, width], where width is the `width` set in `LiveTranscoding`. + */ + int x; + /** + * The y coordinate (pixel) of the host's video on the output video frame (taking the upper left corner of the video frame as the origin). The value range is [0, height], where height is the `height` set in `LiveTranscoding`. + */ + int y; + /** + * The width (pixel) of the host's video. + */ + int width; + /** + * The height (pixel) of the host's video. + */ + int height; + /** + * The layer index number of the host's video. The value range is [0, 100]. + * - 0: (Default) The host's video is the bottom layer. + * - 100: The host's video is the top layer. 
+ *
+ * If the value is beyond this range, the SDK reports the error code `ERR_INVALID_ARGUMENT`.
+ */
+ int zOrder;
+ /**
+ * The transparency of the host's video. The value range is [0.0, 1.0].
+ * - 0.0: Completely transparent.
+ * - 1.0: (Default) Opaque.
+ */
+ double alpha;
+ /**
+ * The audio channel used by the host's audio in the output audio. The default value is 0, and the value range is [0, 5].
+ * - `0`: (Recommended) The default setting, which supports dual channels at most and depends on the upstream of the host.
+ * - `1`: The host's audio uses the FL audio channel. If the host's upstream uses multiple audio channels, the Agora server mixes them into mono first.
+ * - `2`: The host's audio uses the FC audio channel. If the host's upstream uses multiple audio channels, the Agora server mixes them into mono first.
+ * - `3`: The host's audio uses the FR audio channel. If the host's upstream uses multiple audio channels, the Agora server mixes them into mono first.
+ * - `4`: The host's audio uses the BL audio channel. If the host's upstream uses multiple audio channels, the Agora server mixes them into mono first.
+ * - `5`: The host's audio uses the BR audio channel. If the host's upstream uses multiple audio channels, the Agora server mixes them into mono first.
+ * - `0xFF` or a value greater than 5: The host's audio is muted, and the Agora server removes the host's audio.
+ *
+ * @note If the value is not `0`, a special player is required.
+ */
+ int audioChannel;
+
+ TranscodingUser()
+ : uid(0),
+ x(0),
+ y(0),
+ width(0),
+ height(0),
+ zOrder(0),
+ alpha(1.0),
+ audioChannel(0) {}
+};
+
+/**
+ * Transcoding configurations for Media Push.
+ */
+struct LiveTranscoding {
+ /** The width of the video in pixels. The default value is 360.
+ * - When pushing video streams to the CDN, the value range of `width` is [64,1920].
+ * If the value is less than 64, Agora server automatically adjusts it to 64; if the
+ * value is greater than 1920, Agora server automatically adjusts it to 1920.
+ * - When pushing audio streams to the CDN, set `width` and `height` as 0.
+ */
+ int width;
+ /** The height of the video in pixels. The default value is 640.
+ * - When pushing video streams to the CDN, the value range of `height` is [64,1080].
+ * If the value is less than 64, Agora server automatically adjusts it to 64; if the
+ * value is greater than 1080, Agora server automatically adjusts it to 1080.
+ * - When pushing audio streams to the CDN, set `width` and `height` as 0.
+ */
+ int height;
+ /** Bitrate of the CDN live output video stream. The default value is 400 Kbps.
+
+ Set this parameter according to the Video Bitrate Table. If you set a bitrate beyond the proper range, the SDK automatically adapts it to a value within the range.
+ */
+ int videoBitrate;
+ /** Frame rate of the output video stream set for the CDN live streaming. The default value is 15 fps, and the value range is (0,30].
+
+ @note The Agora server adjusts any value over 30 to 30.
+ */
+ int videoFramerate;
+
+ /** **DEPRECATED** Latency mode:
+
+ - true: Low latency with unassured quality.
+ - false: (Default) High latency with assured quality.
+ */
+ bool lowLatency;
+
+ /** Video GOP (in frames). The default value is 30 frames.
+ */
+ int videoGop;
+ /** Self-defined video codec profile: #VIDEO_CODEC_PROFILE_TYPE.
+
+ @note If you set this parameter to other values, Agora adjusts it to the default value of 100.
+ */
+ VIDEO_CODEC_PROFILE_TYPE videoCodecProfile;
+ /** The background color in RGB hex value. Value only. Do not include a preceding #. For example, 0xFFB6C1 (light pink). The default value is 0x000000 (black).
+ */
+ unsigned int backgroundColor;
+ /** Video codec profile types for Media Push. See VIDEO_CODEC_TYPE_FOR_STREAM.
*/
+ VIDEO_CODEC_TYPE_FOR_STREAM videoCodecType;
+ /** The number of users in the live interactive streaming.
+ * The value range is [0, 17].
+ */
+ unsigned int userCount;
+ /** Manages the user layout configuration in the Media Push. Agora supports a maximum of 17 transcoding users in a Media Push channel. See `TranscodingUser`.
+ */
+ TranscodingUser* transcodingUsers;
+ /** Reserved property. Extra user-defined information to send SEI for the H.264/H.265 video stream to the CDN live client. Maximum length: 4096 bytes.
+
+ For more information on SEI frame, see [SEI-related questions](https://docs.agora.io/en/faq/sei).
+ */
+ const char* transcodingExtraInfo;
+
+ /** **DEPRECATED** The metadata sent to the CDN live client.
+ */
+ const char* metadata;
+ /** The watermark on the live video. The image format needs to be PNG. See `RtcImage`.
+
+ You can add one watermark, or add multiple watermarks using an array. This parameter is used with `watermarkCount`.
+ */
+ RtcImage* watermark;
+ /**
+ * The number of watermarks on the live video. The total number of watermarks and background images can range from 0 to 10. This parameter is used with `watermark`.
+ */
+ unsigned int watermarkCount;
+
+ /** The background images on the live video. The image format needs to be PNG. See `RtcImage`.
+ *
+ * You can add a background image or use an array to add multiple background images. This parameter is used with `backgroundImageCount`.
+ */
+ RtcImage* backgroundImage;
+ /**
+ * The number of background images on the live video. The total number of watermarks and background images can range from 0 to 10. This parameter is used with `backgroundImage`.
+ */
+ unsigned int backgroundImageCount;
+
+ /** The audio sampling rate (Hz) of the output media stream. See #AUDIO_SAMPLE_RATE_TYPE.
+ */
+ AUDIO_SAMPLE_RATE_TYPE audioSampleRate;
+ /** Bitrate (Kbps) of the audio output stream for Media Push. The default value is 48, and the highest value is 128.
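+ *
+ * Example (illustrative sketch of a one-host transcoded Media Push; the URL and UID
+ * are placeholders, and `engine` is assumed to be an initialized IRtcEngine):
+ * @code
+ * TranscodingUser host;
+ * host.uid = 12345;         // placeholder host UID
+ * host.x = 0;
+ * host.y = 0;
+ * host.width = 360;         // the host video occupies the whole canvas
+ * host.height = 640;
+ *
+ * LiveTranscoding transcoding;
+ * transcoding.width = 360;
+ * transcoding.height = 640;
+ * transcoding.videoBitrate = 400;
+ * transcoding.userCount = 1;
+ * transcoding.transcodingUsers = &host; // an array of userCount entries
+ *
+ * engine->startRtmpStreamWithTranscoding("rtmp://example.com/live/stream", transcoding);
+ * @endcode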
+ */ + int audioBitrate; + /** The number of audio channels for Media Push. Agora recommends choosing 1 (mono), or 2 (stereo) audio channels. Special players are required if you choose 3, 4, or 5. + * - 1: (Default) Mono. + * - 2: Stereo. + * - 3: Three audio channels. + * - 4: Four audio channels. + * - 5: Five audio channels. + */ + int audioChannels; + /** Audio codec profile type for Media Push. See #AUDIO_CODEC_PROFILE_TYPE. + */ + AUDIO_CODEC_PROFILE_TYPE audioCodecProfile; + /** Advanced features of the RTMP or RTMPS streaming with transcoding. See LiveStreamAdvancedFeature. + */ + LiveStreamAdvancedFeature* advancedFeatures; + + /** The number of enabled advanced features. The default value is 0. */ + unsigned int advancedFeatureCount; + + LiveTranscoding() + : width(360), + height(640), + videoBitrate(400), + videoFramerate(15), + lowLatency(false), + videoGop(30), + videoCodecProfile(VIDEO_CODEC_PROFILE_HIGH), + backgroundColor(0x000000), + videoCodecType(VIDEO_CODEC_H264_FOR_STREAM), + userCount(0), + transcodingUsers(OPTIONAL_NULLPTR), + transcodingExtraInfo(OPTIONAL_NULLPTR), + metadata(OPTIONAL_NULLPTR), + watermark(OPTIONAL_NULLPTR), + watermarkCount(0), + backgroundImage(OPTIONAL_NULLPTR), + backgroundImageCount(0), + audioSampleRate(AUDIO_SAMPLE_RATE_48000), + audioBitrate(48), + audioChannels(1), + audioCodecProfile(AUDIO_CODEC_PROFILE_LC_AAC), + advancedFeatures(OPTIONAL_NULLPTR), + advancedFeatureCount(0) {} +}; + +/** + * The video streams for the video mixing on the local client. + */ +struct TranscodingVideoStream { + /** + * The source type of video for the video mixing on the local client. See #VIDEO_SOURCE_TYPE. + */ + VIDEO_SOURCE_TYPE sourceType; + /** + * The ID of the remote user. + * @note Use this parameter only when the source type of the video for the video mixing on the local client is `VIDEO_SOURCE_REMOTE`. + */ + uid_t remoteUserUid; + /** + * The URL of the image. 
+ * @note Use this parameter only when the source type of the video for the video mixing on the local client is `RTC_IMAGE`.
+ */
+ const char* imageUrl;
+ /**
+ * The ID of the media player. Use this parameter only when `sourceType` is `MEDIA_PLAYER_SOURCE`.
+ */
+ int mediaPlayerId;
+ /**
+ * The horizontal displacement of the top-left corner of the video for the video mixing on the client relative to the top-left corner (origin) of the canvas for this video mixing.
+ */
+ int x;
+ /**
+ * The vertical displacement of the top-left corner of the video for the video mixing on the client relative to the top-left corner (origin) of the canvas for this video mixing.
+ */
+ int y;
+ /**
+ * The width (px) of the video for the video mixing on the local client.
+ */
+ int width;
+ /**
+ * The height (px) of the video for the video mixing on the local client.
+ */
+ int height;
+ /**
+ * The number of the layer to which the video for the video mixing on the local client belongs. The value range is [0,100].
+ * - 0: (Default) The layer is at the bottom.
+ * - 100: The layer is at the top.
+ */
+ int zOrder;
+ /**
+ * The transparency of the video for the video mixing on the local client. The value range is [0.0,1.0]. 0.0 means completely transparent, and 1.0 means opaque.
+ */
+ double alpha;
+ /**
+ * Whether to mirror the video for the video mixing on the local client.
+ * - true: Mirroring.
+ * - false: (Default) Do not mirror.
+ * @note The parameter only works for videos with the source type `CAMERA`.
+ */
+ bool mirror;
+
+ TranscodingVideoStream()
+ : sourceType(VIDEO_SOURCE_CAMERA_PRIMARY),
+ remoteUserUid(0),
+ imageUrl(OPTIONAL_NULLPTR),
+ x(0),
+ y(0),
+ width(0),
+ height(0),
+ zOrder(0),
+ alpha(1.0),
+ mirror(false) {}
+};
+
+/**
+ * The configuration of the video mixing on the local client.
+ */
+struct LocalTranscoderConfiguration {
+ /**
+ * The number of the video streams for the video mixing on the local client.
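+ *
+ * Example (illustrative sketch; mixes the primary camera with a media player
+ * stream, assuming `engine` is an initialized IRtcEngine and `playerId` is a
+ * valid media player ID):
+ * @code
+ * TranscodingVideoStream streams[2];
+ * streams[0].sourceType = VIDEO_SOURCE_CAMERA_PRIMARY;
+ * streams[0].width = 640;
+ * streams[0].height = 360;
+ * streams[1].sourceType = VIDEO_SOURCE_MEDIA_PLAYER;
+ * streams[1].mediaPlayerId = playerId;
+ * streams[1].x = 640;       // place the player video to the right of the camera
+ * streams[1].width = 640;
+ * streams[1].height = 360;
+ *
+ * LocalTranscoderConfiguration config;
+ * config.streamCount = 2;
+ * config.videoInputStreams = streams;
+ * engine->startLocalVideoTranscoder(config);
+ * @endcode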
+ */
+ unsigned int streamCount;
+ /**
+ * The video streams for the video mixing on the local client. See TranscodingVideoStream.
+ */
+ TranscodingVideoStream* videoInputStreams;
+ /**
+ * The encoding configuration of the mixed video stream after the video mixing on the local client. See VideoEncoderConfiguration.
+ */
+ VideoEncoderConfiguration videoOutputConfiguration;
+ /**
+ * Whether to use the timestamp when the primary camera captures the video frame as the timestamp of the mixed video frame.
+ * - true: (Default) Use the timestamp of the captured video frame as the timestamp of the mixed video frame.
+ * - false: Do not use the timestamp of the captured video frame as the timestamp of the mixed video frame. Instead, use the timestamp when the mixed video frame is constructed.
+ */
+ bool syncWithPrimaryCamera;
+
+ LocalTranscoderConfiguration() : streamCount(0), videoInputStreams(OPTIONAL_NULLPTR), videoOutputConfiguration(), syncWithPrimaryCamera(true) {}
+};
+
+enum VIDEO_TRANSCODER_ERROR {
+ /**
+ * The video track of the video source is not started.
+ */
+ VT_ERR_VIDEO_SOURCE_NOT_READY = 1,
+ /**
+ * The video source type is not supported.
+ */
+ VT_ERR_INVALID_VIDEO_SOURCE_TYPE = 2,
+ /**
+ * The image URL of the image source is invalid.
+ */
+ VT_ERR_INVALID_IMAGE_PATH = 3,
+ /**
+ * The image format of the image source is not PNG, JPEG, or GIF.
+ */
+ VT_ERR_UNSUPPORT_IMAGE_FORMAT = 4,
+ /**
+ * The layout is invalid, for example, the width is zero.
+ */
+ VT_ERR_INVALID_LAYOUT = 5,
+ /**
+ * Internal error.
+ */
+ VT_ERR_INTERNAL = 20
+};
+
+/**
+ * Configurations of the last-mile network test.
+ */
+struct LastmileProbeConfig {
+ /**
+ * Determines whether to test the uplink network. Some users, for example,
+ * the audience in a live broadcast channel, do not need such a test:
+ * - true: Test.
+ * - false: Do not test.
+ */
+ bool probeUplink;
+ /**
+ * Determines whether to test the downlink network:
+ * - true: Test.
+ * - false: Do not test.
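+ *
+ * Example (illustrative sketch; the bitrates are placeholders and `engine` is
+ * assumed to be an initialized IRtcEngine):
+ * @code
+ * LastmileProbeConfig config;
+ * config.probeUplink = true;               // hosts typically test both directions
+ * config.probeDownlink = true;
+ * config.expectedUplinkBitrate = 1000000;  // 1 Mbps, within [100000, 5000000]
+ * config.expectedDownlinkBitrate = 1000000;
+ * engine->startLastmileProbeTest(config);
+ * // The result is reported asynchronously via the onLastmileProbeResult callback.
+ * @endcode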
+ */ + bool probeDownlink; + /** + * The expected maximum sending bitrate (bps) of the local user. The value range is [100000, 5000000]. We recommend setting this parameter + * according to the bitrate value set by `setVideoEncoderConfiguration`. + */ + unsigned int expectedUplinkBitrate; + /** + * The expected maximum receiving bitrate (bps) of the local user. The value range is [100000,5000000]. + */ + unsigned int expectedDownlinkBitrate; +}; + +/** + * The status of the last-mile network tests. + */ +enum LASTMILE_PROBE_RESULT_STATE { + /** + * 1: The last-mile network probe test is complete. + */ + LASTMILE_PROBE_RESULT_COMPLETE = 1, + /** + * 2: The last-mile network probe test is incomplete because the bandwidth estimation is not available due to limited test resources. + */ + LASTMILE_PROBE_RESULT_INCOMPLETE_NO_BWE = 2, + /** + * 3: The last-mile network probe test is not carried out, probably due to poor network conditions. + */ + LASTMILE_PROBE_RESULT_UNAVAILABLE = 3 +}; + +/** + * Results of the uplink or downlink last-mile network test. + */ +struct LastmileProbeOneWayResult { + /** + * The packet loss rate (%). + */ + unsigned int packetLossRate; + /** + * The network jitter (ms). + */ + unsigned int jitter; + /** + * The estimated available bandwidth (bps). + */ + unsigned int availableBandwidth; + + LastmileProbeOneWayResult() : packetLossRate(0), + jitter(0), + availableBandwidth(0) {} +}; + +/** + * Results of the uplink and downlink last-mile network tests. + */ +struct LastmileProbeResult { + /** + * The status of the last-mile network tests. See #LASTMILE_PROBE_RESULT_STATE. + */ + LASTMILE_PROBE_RESULT_STATE state; + /** + * Results of the uplink last-mile network test. For details, see LastmileProbeOneWayResult. + */ + LastmileProbeOneWayResult uplinkReport; + /** + * Results of the downlink last-mile network test. For details, see LastmileProbeOneWayResult. + */ + LastmileProbeOneWayResult downlinkReport; + /** + * The round-trip time (ms). 
+ */ + unsigned int rtt; + + LastmileProbeResult() + : state(LASTMILE_PROBE_RESULT_UNAVAILABLE), + rtt(0) {} +}; + +/** + * Reasons causing the change of the connection state. + */ +enum CONNECTION_CHANGED_REASON_TYPE +{ + /** + * 0: The SDK is connecting to the server. + */ + CONNECTION_CHANGED_CONNECTING = 0, + /** + * 1: The SDK has joined the channel successfully. + */ + CONNECTION_CHANGED_JOIN_SUCCESS = 1, + /** + * 2: The connection between the SDK and the server is interrupted. + */ + CONNECTION_CHANGED_INTERRUPTED = 2, + /** + * 3: The connection between the SDK and the server is banned by the server. This error occurs when the user is kicked out of the channel by the server. + */ + CONNECTION_CHANGED_BANNED_BY_SERVER = 3, + /** + * 4: The SDK fails to join the channel. When the SDK fails to join the channel for more than 20 minutes, this error occurs and the SDK stops reconnecting to the channel. + */ + CONNECTION_CHANGED_JOIN_FAILED = 4, + /** + * 5: The SDK has left the channel. + */ + CONNECTION_CHANGED_LEAVE_CHANNEL = 5, + /** + * 6: The connection fails because the App ID is not valid. + */ + CONNECTION_CHANGED_INVALID_APP_ID = 6, + /** + * 7: The connection fails because the channel name is not valid. Please rejoin the channel with a valid channel name. + */ + CONNECTION_CHANGED_INVALID_CHANNEL_NAME = 7, + /** + * 8: The connection fails because the token is not valid. Typical reasons include: + * - The App Certificate for the project is enabled in Agora Console, but you do not use a token when joining the channel. If you enable the App Certificate, you must use a token to join the channel. + * - The `uid` specified when calling `joinChannel` to join the channel is inconsistent with the `uid` passed in when generating the token. + */ + CONNECTION_CHANGED_INVALID_TOKEN = 8, + /** + * 9: The connection fails because the token has expired. + */ + CONNECTION_CHANGED_TOKEN_EXPIRED = 9, + /** + * 10: The connection is rejected by the server. 
Typical reasons include:
+ * - The user is already in the channel and still calls a method, for example, `joinChannel`, to join the channel. Stop calling this method to clear this error.
+ * - The user tries to join the channel when conducting a pre-call test. The user needs to join the channel after the call test ends.
+ */
+ CONNECTION_CHANGED_REJECTED_BY_SERVER = 10,
+ /**
+ * 11: The connection changes to reconnecting because the SDK has set a proxy server.
+ */
+ CONNECTION_CHANGED_SETTING_PROXY_SERVER = 11,
+ /**
+ * 12: The connection state changed because the token is renewed.
+ */
+ CONNECTION_CHANGED_RENEW_TOKEN = 12,
+ /**
+ * 13: The IP address of the client has changed, possibly because the network type, IP address, or port has been changed.
+ */
+ CONNECTION_CHANGED_CLIENT_IP_ADDRESS_CHANGED = 13,
+ /**
+ * 14: Timeout for the keep-alive of the connection between the SDK and the Agora edge server. The connection state changes to CONNECTION_STATE_RECONNECTING.
+ */
+ CONNECTION_CHANGED_KEEP_ALIVE_TIMEOUT = 14,
+ /**
+ * 15: The SDK has rejoined the channel successfully.
+ */
+ CONNECTION_CHANGED_REJOIN_SUCCESS = 15,
+ /**
+ * 16: The connection between the SDK and the server is lost.
+ */
+ CONNECTION_CHANGED_LOST = 16,
+ /**
+ * 17: The connection state changed because of the echo test.
+ */
+ CONNECTION_CHANGED_ECHO_TEST = 17,
+ /**
+ * 18: The local IP address was changed by the user.
+ */
+ CONNECTION_CHANGED_CLIENT_IP_ADDRESS_CHANGED_BY_USER = 18,
+ /**
+ * 19: The connection failed because the same UID joined the same channel on another device.
+ */
+ CONNECTION_CHANGED_SAME_UID_LOGIN = 19,
+ /**
+ * 20: The connection failed because there are too many broadcasters in the channel.
+ */
+ CONNECTION_CHANGED_TOO_MANY_BROADCASTERS = 20,
+
+ /**
+ * 21: The connection failed due to license validation failure.
+ */
+ CONNECTION_CHANGED_LICENSE_VALIDATION_FAILURE = 21,
+ /**
+ * 22: The connection failed because certificate verification failed.
+ */
+ CONNECTION_CHANGED_CERTIFICATION_VERYFY_FAILURE = 22,
+ /**
+ * 23: The connection failed because permission to the stream channel is not granted.
+ */
+ CONNECTION_CHANGED_STREAM_CHANNEL_NOT_AVAILABLE = 23,
+ /**
+ * 24: The connection failed because the channel was joined with an inconsistent App ID.
+ */
+ CONNECTION_CHANGED_INCONSISTENT_APPID = 24,
+};
+
+/**
+ * Reasons for a client role change failure.
+ */
+enum CLIENT_ROLE_CHANGE_FAILED_REASON {
+ /**
+ * 1: Too many broadcasters in the channel.
+ */
+ CLIENT_ROLE_CHANGE_FAILED_TOO_MANY_BROADCASTERS = 1,
+ /**
+ * 2: The role change operation is not authorized.
+ */
+ CLIENT_ROLE_CHANGE_FAILED_NOT_AUTHORIZED = 2,
+ /**
+ * 3: The role change operation timed out.
+ */
+ CLIENT_ROLE_CHANGE_FAILED_REQUEST_TIME_OUT = 3,
+ /**
+ * 4: The role change operation was interrupted because the SDK lost connection with the Agora service.
+ */
+ CLIENT_ROLE_CHANGE_FAILED_CONNECTION_FAILED = 4,
+};
+
+/**
+ * The reason for notifying the user of a message.
+ */
+enum WLACC_MESSAGE_REASON {
+ /**
+ * The Wi-Fi signal is weak.
+ */
+ WLACC_MESSAGE_REASON_WEAK_SIGNAL = 0,
+ /**
+ * Channel congestion.
+ */
+ WLACC_MESSAGE_REASON_CHANNEL_CONGESTION = 1,
+};
+
+/**
+ * Suggested actions for the user.
+ */
+enum WLACC_SUGGEST_ACTION {
+ /**
+ * The user is advised to move closer to the access point (AP).
+ */
+ WLACC_SUGGEST_ACTION_CLOSE_TO_WIFI = 0,
+ /**
+ * The user is advised to connect to the prompted SSID.
+ */
+ WLACC_SUGGEST_ACTION_CONNECT_SSID = 1,
+ /**
+ * The AP does not support the 5G band. The user is advised to check whether the AP supports the 5G band and enable it (the action link is attached), or purchase an AP that supports 5G.
+ */
+ WLACC_SUGGEST_ACTION_CHECK_5G = 2,
+ /**
+ * The SSID of the 2.4G band AP is the same as that of the 5G band. The user is advised to change the SSID of the 2.4G or 5G band (the action link is attached).
+ */
+ WLACC_SUGGEST_ACTION_MODIFY_SSID = 3,
+};
+
+/**
+ * Indicator optimization degree.
+ */
+struct WlAccStats {
+ /**
+ * End-to-end delay optimization percentage.
+ */
+ unsigned short e2eDelayPercent;
+ /**
+ * Frozen ratio optimization percentage.
+ */
+ unsigned short frozenRatioPercent;
+ /**
+ * Loss rate optimization percentage.
+ */
+ unsigned short lossRatePercent;
+};
+
+/**
+ * The network type.
+ */
+enum NETWORK_TYPE {
+ /**
+ * -1: The network type is unknown.
+ */
+ NETWORK_TYPE_UNKNOWN = -1,
+ /**
+ * 0: The SDK disconnects from the network.
+ */
+ NETWORK_TYPE_DISCONNECTED = 0,
+ /**
+ * 1: The network type is LAN.
+ */
+ NETWORK_TYPE_LAN = 1,
+ /**
+ * 2: The network type is Wi-Fi (including hotspots).
+ */
+ NETWORK_TYPE_WIFI = 2,
+ /**
+ * 3: The network type is mobile 2G.
+ */
+ NETWORK_TYPE_MOBILE_2G = 3,
+ /**
+ * 4: The network type is mobile 3G.
+ */
+ NETWORK_TYPE_MOBILE_3G = 4,
+ /**
+ * 5: The network type is mobile 4G.
+ */
+ NETWORK_TYPE_MOBILE_4G = 5,
+ /**
+ * 6: The network type is mobile 5G.
+ */
+ NETWORK_TYPE_MOBILE_5G = 6,
+};
+
+/**
+ * The mode of setting up video views.
+ */
+enum VIDEO_VIEW_SETUP_MODE {
+ /**
+ * 0: Replace one view.
+ */
+ VIDEO_VIEW_SETUP_REPLACE = 0,
+ /**
+ * 1: Add one view.
+ */
+ VIDEO_VIEW_SETUP_ADD = 1,
+ /**
+ * 2: Remove one view.
+ */
+ VIDEO_VIEW_SETUP_REMOVE = 2,
+};
+
+/**
+ * Attributes of the video canvas object.
+ */
+struct VideoCanvas {
+ /**
+ * The user ID of the local video.
+ */
+ uid_t uid;
+
+ /**
+ * The UID of the video stream that composes the video stream from the transcoder and is drawn on this video canvas.
+ */
+ uid_t subviewUid;
+ /**
+ * The video display window.
+ */
+ view_t view;
+ /**
+ * An RGBA value that indicates the background color of the render view. The default value is 0x00000000.
+ */
+ uint32_t backgroundColor;
+ /**
+ * The video render mode. See \ref agora::media::base::RENDER_MODE_TYPE "RENDER_MODE_TYPE".
+ * The default value is RENDER_MODE_HIDDEN.
+ */
+ media::base::RENDER_MODE_TYPE renderMode;
+ /**
+ * The video mirror mode.
See \ref VIDEO_MIRROR_MODE_TYPE "VIDEO_MIRROR_MODE_TYPE".
+ * The default value is VIDEO_MIRROR_MODE_AUTO.
+ * @note
+ * - For the mirror mode of the local video view:
+ * If you use a front camera, the SDK enables the mirror mode by default;
+ * if you use a rear camera, the SDK disables the mirror mode by default.
+ * - For the remote user: The mirror mode is disabled by default.
+ */
+ VIDEO_MIRROR_MODE_TYPE mirrorMode;
+ /**
+ * The mode of setting up the video view. See \ref VIDEO_VIEW_SETUP_MODE "VIDEO_VIEW_SETUP_MODE".
+ * The default value is VIDEO_VIEW_SETUP_REPLACE.
+ */
+ VIDEO_VIEW_SETUP_MODE setupMode;
+ /**
+ * The video source type. See \ref VIDEO_SOURCE_TYPE "VIDEO_SOURCE_TYPE".
+ * The default value is VIDEO_SOURCE_CAMERA_PRIMARY.
+ */
+ VIDEO_SOURCE_TYPE sourceType;
+ /**
+ * The media player ID of AgoraMediaPlayer. Set this parameter when the
+ * sourceType is VIDEO_SOURCE_MEDIA_PLAYER to show the video that AgoraMediaPlayer is playing.
+ * You can get this value by calling the method \ref getMediaPlayerId().
+ */
+ int mediaPlayerId;
+ /**
+ * If you want to display a certain part of a video frame, you can set
+ * this value to crop the video frame to show.
+ * The default value is empty (that is, it has zero width or height), which means no cropping.
+ */
+ Rectangle cropArea;
+ /**
+ * Whether to apply an alpha mask to the video frame if it exists:
+ * true: Apply the alpha mask to the video frame.
+ * false: (Default) Do not apply the alpha mask to the video frame.
+ */
+ bool enableAlphaMask;
+ /**
+ * The video frame position in the pipeline. See \ref VIDEO_MODULE_POSITION "VIDEO_MODULE_POSITION".
+ * The default value is POSITION_POST_CAPTURER.
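+ *
+ * Example (illustrative sketch; `view` is a platform-specific window handle and
+ * `engine` is assumed to be an initialized IRtcEngine):
+ * @code
+ * VideoCanvas canvas;
+ * canvas.uid = 0;                       // 0 means the local user
+ * canvas.view = view;                   // the window or view to render into
+ * canvas.renderMode = media::base::RENDER_MODE_FIT;
+ * canvas.mirrorMode = VIDEO_MIRROR_MODE_AUTO;
+ * engine->setupLocalVideo(canvas);
+ * @endcode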
+ */ + media::base::VIDEO_MODULE_POSITION position; + + VideoCanvas() + : uid(0), subviewUid(0), view(NULL), backgroundColor(0x00000000), renderMode(media::base::RENDER_MODE_HIDDEN), mirrorMode(VIDEO_MIRROR_MODE_AUTO), + setupMode(VIDEO_VIEW_SETUP_REPLACE), sourceType(VIDEO_SOURCE_CAMERA_PRIMARY), mediaPlayerId(-ERR_NOT_READY), + cropArea(0, 0, 0, 0), enableAlphaMask(false), position(media::base::POSITION_POST_CAPTURER) {} + + VideoCanvas(view_t v, media::base::RENDER_MODE_TYPE m, VIDEO_MIRROR_MODE_TYPE mt) + : uid(0), subviewUid(0), view(v), backgroundColor(0x00000000), renderMode(m), mirrorMode(mt), setupMode(VIDEO_VIEW_SETUP_REPLACE), + sourceType(VIDEO_SOURCE_CAMERA_PRIMARY), mediaPlayerId(-ERR_NOT_READY), + cropArea(0, 0, 0, 0), enableAlphaMask(false), position(media::base::POSITION_POST_CAPTURER) {} + + VideoCanvas(view_t v, media::base::RENDER_MODE_TYPE m, VIDEO_MIRROR_MODE_TYPE mt, uid_t u) + : uid(u), subviewUid(0), view(v), backgroundColor(0x00000000), renderMode(m), mirrorMode(mt), setupMode(VIDEO_VIEW_SETUP_REPLACE), + sourceType(VIDEO_SOURCE_CAMERA_PRIMARY), mediaPlayerId(-ERR_NOT_READY), + cropArea(0, 0, 0, 0), enableAlphaMask(false), position(media::base::POSITION_POST_CAPTURER) {} + + VideoCanvas(view_t v, media::base::RENDER_MODE_TYPE m, VIDEO_MIRROR_MODE_TYPE mt, uid_t u, uid_t subu) + : uid(u), subviewUid(subu), view(v), backgroundColor(0x00000000), renderMode(m), mirrorMode(mt), setupMode(VIDEO_VIEW_SETUP_REPLACE), + sourceType(VIDEO_SOURCE_CAMERA_PRIMARY), mediaPlayerId(-ERR_NOT_READY), + cropArea(0, 0, 0, 0), enableAlphaMask(false), position(media::base::POSITION_POST_CAPTURER) {} +}; + +/** Image enhancement options. + */ +struct BeautyOptions { + /** The contrast level. + */ + enum LIGHTENING_CONTRAST_LEVEL { + /** Low contrast level. */ + LIGHTENING_CONTRAST_LOW = 0, + /** (Default) Normal contrast level. */ + LIGHTENING_CONTRAST_NORMAL = 1, + /** High contrast level. 
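+ *
+ * Example (illustrative sketch of enabling image enhancement; the levels are
+ * placeholders and `engine` is assumed to be an initialized IRtcEngine):
+ * @code
+ * BeautyOptions options;
+ * options.lighteningContrastLevel = BeautyOptions::LIGHTENING_CONTRAST_NORMAL;
+ * options.lighteningLevel = 0.5f;   // moderate whitening
+ * options.smoothnessLevel = 0.5f;   // moderate skin smoothing
+ * options.rednessLevel = 0.1f;
+ * options.sharpnessLevel = 0.3f;
+ * engine->setBeautyEffectOptions(true, options);
+ * @endcode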
*/
+ LIGHTENING_CONTRAST_HIGH = 2,
+ };
+
+ /** The contrast level, used with the `lighteningLevel` parameter. The larger the value, the greater the contrast between light and dark. See #LIGHTENING_CONTRAST_LEVEL.
+ */
+ LIGHTENING_CONTRAST_LEVEL lighteningContrastLevel;
+
+ /** The brightness level. The value ranges from 0.0 (original) to 1.0. The default value is 0.0. The greater the value, the greater the degree of whitening. */
+ float lighteningLevel;
+
+ /** The smoothness level. The value ranges from 0.0 (original) to 1.0. The default value is 0.0. The greater the value, the greater the degree of skin smoothing.
+ */
+ float smoothnessLevel;
+
+ /** The redness level. The value ranges from 0.0 (original) to 1.0. The default value is 0.0. The larger the value, the greater the redness.
+ */
+ float rednessLevel;
+
+ /** The sharpness level. The value ranges from 0.0 (original) to 1.0. The default value is 0.0. The larger the value, the greater the sharpness.
+ */
+ float sharpnessLevel;
+
+ BeautyOptions(LIGHTENING_CONTRAST_LEVEL contrastLevel, float lightening, float smoothness, float redness, float sharpness) : lighteningContrastLevel(contrastLevel), lighteningLevel(lightening), smoothnessLevel(smoothness), rednessLevel(redness), sharpnessLevel(sharpness) {}
+
+ BeautyOptions() : lighteningContrastLevel(LIGHTENING_CONTRAST_NORMAL), lighteningLevel(0), smoothnessLevel(0), rednessLevel(0), sharpnessLevel(0) {}
+};
+
+struct LowlightEnhanceOptions {
+ /**
+ * The low-light enhancement mode.
+ */
+ enum LOW_LIGHT_ENHANCE_MODE {
+ /** 0: (Default) Automatic mode. The SDK automatically enables or disables the low-light enhancement feature according to the ambient light to compensate for the lighting level or prevent overexposure, as necessary. */
+ LOW_LIGHT_ENHANCE_AUTO = 0,
+ /** Manual mode. Users need to enable or disable the low-light enhancement feature manually. */
+ LOW_LIGHT_ENHANCE_MANUAL = 1,
+ };
+ /**
+ * The low-light enhancement level.
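+ *
+ * Example (illustrative sketch; `engine` is assumed to be an initialized
+ * IRtcEngine):
+ * @code
+ * LowlightEnhanceOptions options;
+ * options.mode = LowlightEnhanceOptions::LOW_LIGHT_ENHANCE_AUTO;
+ * options.level = LowlightEnhanceOptions::LOW_LIGHT_ENHANCE_LEVEL_HIGH_QUALITY;
+ * engine->setLowlightEnhanceOptions(true, options);
+ * @endcode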
+ */ + enum LOW_LIGHT_ENHANCE_LEVEL { + /** + * 0: (Default) Promotes video quality during low-light enhancement. It processes the brightness, details, and noise of the video image. The performance consumption is moderate, the processing speed is moderate, and the overall video quality is optimal. + */ + LOW_LIGHT_ENHANCE_LEVEL_HIGH_QUALITY = 0, + /** + * Promotes performance during low-light enhancement. It processes the brightness and details of the video image. The processing speed is faster. + */ + LOW_LIGHT_ENHANCE_LEVEL_FAST = 1, + }; + + /** The low-light enhancement mode. See #LOW_LIGHT_ENHANCE_MODE. + */ + LOW_LIGHT_ENHANCE_MODE mode; + + /** The low-light enhancement level. See #LOW_LIGHT_ENHANCE_LEVEL. + */ + LOW_LIGHT_ENHANCE_LEVEL level; + + LowlightEnhanceOptions(LOW_LIGHT_ENHANCE_MODE lowlightMode, LOW_LIGHT_ENHANCE_LEVEL lowlightLevel) : mode(lowlightMode), level(lowlightLevel) {} + + LowlightEnhanceOptions() : mode(LOW_LIGHT_ENHANCE_AUTO), level(LOW_LIGHT_ENHANCE_LEVEL_HIGH_QUALITY) {} +}; +/** + * The video noise reduction options. + * + * @since v4.0.0 + */ +struct VideoDenoiserOptions { + /** The video noise reduction mode. + */ + enum VIDEO_DENOISER_MODE { + /** 0: (Default) Automatic mode. The SDK automatically enables or disables the video noise reduction feature according to the ambient light. */ + VIDEO_DENOISER_AUTO = 0, + /** Manual mode. Users need to enable or disable the video noise reduction feature manually. */ + VIDEO_DENOISER_MANUAL = 1, + }; + /** + * The video noise reduction level. + */ + enum VIDEO_DENOISER_LEVEL { + /** + * 0: (Default) Promotes video quality during video noise reduction. `HIGH_QUALITY` balances performance consumption and video noise reduction quality. + * The performance consumption is moderate, the video noise reduction speed is moderate, and the overall video quality is optimal. + */ + VIDEO_DENOISER_LEVEL_HIGH_QUALITY = 0, + /** + * Promotes reducing performance consumption during video noise reduction. 
`FAST` prioritizes reducing performance consumption over video noise reduction quality. + * The performance consumption is lower, and the video noise reduction speed is faster. To avoid a noticeable shadowing effect (shadows trailing behind moving objects) in the processed video, Agora recommends that you use `FAST` when the camera is fixed. + */ + VIDEO_DENOISER_LEVEL_FAST = 1, + /** + * Enhanced video noise reduction. `STRENGTH` prioritizes video noise reduction quality over reducing performance consumption. + * The performance consumption is higher, the video noise reduction speed is slower, and the video noise reduction quality is better. + * If `HIGH_QUALITY` is not enough for your video noise reduction needs, you can use `STRENGTH`. + */ + VIDEO_DENOISER_LEVEL_STRENGTH = 2, + }; + /** The video noise reduction mode. See #VIDEO_DENOISER_MODE. + */ + VIDEO_DENOISER_MODE mode; + + /** The video noise reduction level. See #VIDEO_DENOISER_LEVEL. + */ + VIDEO_DENOISER_LEVEL level; + + VideoDenoiserOptions(VIDEO_DENOISER_MODE denoiserMode, VIDEO_DENOISER_LEVEL denoiserLevel) : mode(denoiserMode), level(denoiserLevel) {} + + VideoDenoiserOptions() : mode(VIDEO_DENOISER_AUTO), level(VIDEO_DENOISER_LEVEL_HIGH_QUALITY) {} +}; + +/** The color enhancement options. + * + * @since v4.0.0 + */ +struct ColorEnhanceOptions { + /** The level of color enhancement. The value range is [0.0,1.0]. `0.0` is the default value, which means no color enhancement is applied to the video. The higher the value, the higher the level of color enhancement. + */ + float strengthLevel; + + /** The level of skin tone protection. The value range is [0.0,1.0]. `0.0` means no skin tone protection. The higher the value, the higher the level of skin tone protection. + * The default value is `1.0`. 
When the level of color enhancement is higher, the portrait skin tone can be significantly distorted, so you need to set the level of skin tone protection; when the level of skin tone protection is higher, the color enhancement effect can be slightly reduced. + * Therefore, to get the best color enhancement effect, Agora recommends that you adjust `strengthLevel` and `skinProtectLevel` to get the most appropriate values. + */ + float skinProtectLevel; + + ColorEnhanceOptions(float strength, float skinProtect) : strengthLevel(strength), skinProtectLevel(skinProtect) {} + + ColorEnhanceOptions() : strengthLevel(0), skinProtectLevel(1) {} +}; + +/** + * The custom background image. + */ +struct VirtualBackgroundSource { + /** The type of the custom background source. + */ + enum BACKGROUND_SOURCE_TYPE { + /** + * 0: Enable segmentation with the captured video frame without replacing the background. + */ + BACKGROUND_NONE = 0, + /** + * 1: (Default) The background source is a solid color. + */ + BACKGROUND_COLOR = 1, + /** + * The background source is a file in PNG or JPG format. + */ + BACKGROUND_IMG = 2, + /** + * The background source is the blurred original video frame. + * */ + BACKGROUND_BLUR = 3, + /** + * The background source is a file in MP4, AVI, MKV, or FLV format. + * */ + BACKGROUND_VIDEO = 4, + }; + + /** The degree of blurring applied to the background source. + */ + enum BACKGROUND_BLUR_DEGREE { + /** 1: The degree of blurring applied to the custom background image is low. The user can almost see the background clearly. */ + BLUR_DEGREE_LOW = 1, + /** 2: The degree of blurring applied to the custom background image is medium. It is difficult for the user to recognize details in the background. */ + BLUR_DEGREE_MEDIUM = 2, + /** 3: (Default) The degree of blurring applied to the custom background image is high. The user can barely see any distinguishing features in the background.
*/ + BLUR_DEGREE_HIGH = 3, + }; + + /** The type of the custom background image. See #BACKGROUND_SOURCE_TYPE. + */ + BACKGROUND_SOURCE_TYPE background_source_type; + + /** + * The color of the custom background image. The format is a hexadecimal integer defined by RGB, without the # sign, + * such as 0xFFB6C1 for light pink. The default value is 0xFFFFFF, which signifies white. The value range + * is [0x000000,0xFFFFFF]. If the value is invalid, the SDK replaces the original background image with a white + * background image. + * + * @note This parameter takes effect only when the type of the custom background image is `BACKGROUND_COLOR`. + */ + unsigned int color; + + /** + * The local absolute path of the custom background image. PNG and JPG formats are supported. If the path is invalid, + * the SDK replaces the original background image with a white background image. + * + * @note This parameter takes effect only when the type of the custom background image is `BACKGROUND_IMG`. + */ + const char* source; + + /** The degree of blurring applied to the custom background image. See BACKGROUND_BLUR_DEGREE. + * @note This parameter takes effect only when the type of the custom background image is `BACKGROUND_BLUR`. 
+ */ + BACKGROUND_BLUR_DEGREE blur_degree; + + VirtualBackgroundSource() : background_source_type(BACKGROUND_COLOR), color(0xffffff), source(OPTIONAL_NULLPTR), blur_degree(BLUR_DEGREE_HIGH) {} +}; + +/** The custom segmentation property. + */ +struct SegmentationProperty { + /** The segmentation model type. + */ + enum SEG_MODEL_TYPE { + /** 1: (Default) The segmentation model suitable for all scenarios. */ + SEG_MODEL_AI = 1, + /** 2: The segmentation model suitable for green-screen backgrounds. */ + SEG_MODEL_GREEN = 2 + }; + + /** The segmentation model to use. See #SEG_MODEL_TYPE. + */ + SEG_MODEL_TYPE modelType; + + /** The accuracy range for identifying shades of green in green-screen mode. The value range is [0.0,1.0], and the default value is 0.5. The larger the value, the wider the range of identifiable shades of green. + */ + float greenCapacity; + + SegmentationProperty() : modelType(SEG_MODEL_AI), greenCapacity(0.5) {} +}; + +/** The type of the custom audio track. +*/ +enum AUDIO_TRACK_TYPE { + /** + * -1: Invalid audio track. + */ + AUDIO_TRACK_INVALID = -1, + /** + * 0: Mixable audio track. + * You can push more than one mixable audio track into one RTC connection (channel ID + UID), + * and the SDK automatically mixes these tracks into one audio track. + * However, compared with a direct audio track, a mixable track may introduce an extra delay of 30 ms or more. + */ + AUDIO_TRACK_MIXABLE = 0, + /** + * 1: Direct audio track. + * You can push only one direct (non-mixable) audio track into one RTC connection (channel ID + UID). + * Compared with a mixable track, a direct audio track has lower latency. + */ + AUDIO_TRACK_DIRECT = 1, +}; + +/** The configuration of the custom audio track. +*/ +struct AudioTrackConfig { + /** + * Whether to enable local playback: + * - `true`: (Default) Enable local playback. + * - `false`: Do not enable local playback. + */ + bool enableLocalPlayback; + + AudioTrackConfig() + : enableLocalPlayback(true) {} +}; + +/** + * Preset local voice reverberation options.
+ * bitmap allocation: + * | bit31 | bit30 - bit24 | bit23 - bit16 | bit15 - bit8 | bit7 - bit0 | + * |---------|--------------------|-----------------------------|--------------|----------------| + * |reserved | 0x1: voice beauty | 0x1: chat beautification | effect types | effect settings| + * | | | 0x2: singing beautification | | | + * | | | 0x3: timbre transform | | | + * | | | 0x4: ultra high quality | | | + * | |--------------------|-----------------------------| | | + * | | 0x2: audio effect | 0x1: space construction | | | + * | | | 0x2: voice changer effect | | | + * | | | 0x3: style transform | | | + * | | | 0x4: electronic sound | | | + * | | | 0x5: magic tone | | | + * | |--------------------|-----------------------------| | | + * | | 0x3: voice changer | 0x1: voice transform | | | + */ +/** The options for SDK preset voice beautifier effects. + */ +enum VOICE_BEAUTIFIER_PRESET { + /** Turn off voice beautifier effects and use the original voice. + */ + VOICE_BEAUTIFIER_OFF = 0x00000000, + /** A more magnetic voice. + * + * @note Agora recommends using this enumerator to process a male-sounding voice; otherwise, you + * may experience vocal distortion. + */ + CHAT_BEAUTIFIER_MAGNETIC = 0x01010100, + /** A fresher voice. + * + * @note Agora recommends using this enumerator to process a female-sounding voice; otherwise, you + * may experience vocal distortion. + */ + CHAT_BEAUTIFIER_FRESH = 0x01010200, + /** A more vital voice. + * + * @note Agora recommends using this enumerator to process a female-sounding voice; otherwise, you + * may experience vocal distortion. + */ + CHAT_BEAUTIFIER_VITALITY = 0x01010300, + /** + * Singing beautifier effect. + * - If you call `setVoiceBeautifierPreset`(SINGING_BEAUTIFIER), you can beautify a male-sounding voice and add a reverberation effect + * that sounds like singing in a small room.
Agora recommends not using `setVoiceBeautifierPreset`(SINGING_BEAUTIFIER) to process + * a female-sounding voice; otherwise, you may experience vocal distortion. + * - If you call `setVoiceBeautifierParameters`(SINGING_BEAUTIFIER, param1, param2), you can beautify a male- or + * female-sounding voice and add a reverberation effect. + */ + SINGING_BEAUTIFIER = 0x01020100, + /** A more vigorous voice. + */ + TIMBRE_TRANSFORMATION_VIGOROUS = 0x01030100, + /** A deeper voice. + */ + TIMBRE_TRANSFORMATION_DEEP = 0x01030200, + /** A mellower voice. + */ + TIMBRE_TRANSFORMATION_MELLOW = 0x01030300, + /** A falsetto voice. + */ + TIMBRE_TRANSFORMATION_FALSETTO = 0x01030400, + /** A fuller voice. + */ + TIMBRE_TRANSFORMATION_FULL = 0x01030500, + /** A clearer voice. + */ + TIMBRE_TRANSFORMATION_CLEAR = 0x01030600, + /** A more resounding voice. + */ + TIMBRE_TRANSFORMATION_RESOUNDING = 0x01030700, + /** A more ringing voice. + */ + TIMBRE_TRANSFORMATION_RINGING = 0x01030800, + /** + * An ultra-high quality voice, which makes the audio clearer and restores more details. + * - To achieve better audio effect quality, Agora recommends that you call `setAudioProfile` + * and set the `profile` to `AUDIO_PROFILE_MUSIC_HIGH_QUALITY(4)` or `AUDIO_PROFILE_MUSIC_HIGH_QUALITY_STEREO(5)` + * and `scenario` to `AUDIO_SCENARIO_HIGH_DEFINITION(6)` before calling `setVoiceBeautifierPreset`. + * - If you have an audio capturing device that can already restore audio details to a high + * degree, Agora recommends that you do not enable ultra-high quality; otherwise, the SDK may + * over-restore audio details, and you may not hear the anticipated voice effect. + */ + ULTRA_HIGH_QUALITY_VOICE = 0x01040100 +}; + +/** Preset voice effects.
+ * + * For better voice effects, Agora recommends setting the `profile` parameter of `setAudioProfile` to `AUDIO_PROFILE_MUSIC_HIGH_QUALITY` or `AUDIO_PROFILE_MUSIC_HIGH_QUALITY_STEREO` before using the following presets: + * + * - `ROOM_ACOUSTICS_KTV` + * - `ROOM_ACOUSTICS_VOCAL_CONCERT` + * - `ROOM_ACOUSTICS_STUDIO` + * - `ROOM_ACOUSTICS_PHONOGRAPH` + * - `ROOM_ACOUSTICS_SPACIAL` + * - `ROOM_ACOUSTICS_ETHEREAL` + * - `VOICE_CHANGER_EFFECT_UNCLE` + * - `VOICE_CHANGER_EFFECT_OLDMAN` + * - `VOICE_CHANGER_EFFECT_BOY` + * - `VOICE_CHANGER_EFFECT_SISTER` + * - `VOICE_CHANGER_EFFECT_GIRL` + * - `VOICE_CHANGER_EFFECT_PIGKING` + * - `VOICE_CHANGER_EFFECT_HULK` + * - `PITCH_CORRECTION` + */ +enum AUDIO_EFFECT_PRESET { + /** Turn off voice effects, that is, use the original voice. + */ + AUDIO_EFFECT_OFF = 0x00000000, + /** The voice effect typical of a KTV venue. + */ + ROOM_ACOUSTICS_KTV = 0x02010100, + /** The voice effect typical of a concert hall. + */ + ROOM_ACOUSTICS_VOCAL_CONCERT = 0x02010200, + /** The voice effect typical of a recording studio. + */ + ROOM_ACOUSTICS_STUDIO = 0x02010300, + /** The voice effect typical of a vintage phonograph. + */ + ROOM_ACOUSTICS_PHONOGRAPH = 0x02010400, + /** The virtual stereo effect, which renders monophonic audio as stereo audio. + * + * @note Before using this preset, set the `profile` parameter of `setAudioProfile` + * to `AUDIO_PROFILE_MUSIC_STANDARD_STEREO(3)` or `AUDIO_PROFILE_MUSIC_HIGH_QUALITY_STEREO(5)`; + * otherwise, the preset setting is invalid. + */ + ROOM_ACOUSTICS_VIRTUAL_STEREO = 0x02010500, + /** A more spatial voice effect. + */ + ROOM_ACOUSTICS_SPACIAL = 0x02010600, + /** A more ethereal voice effect. + */ + ROOM_ACOUSTICS_ETHEREAL = 0x02010700, + /** A 3D voice effect that makes the voice appear to be moving around the user. The default cycle + * period of the 3D voice effect is 10 seconds. To change the cycle period, call `setAudioEffectParameters` + * after this method. 
+ * + * @note + * - Before using this preset, set the `profile` parameter of `setAudioProfile` to + * `AUDIO_PROFILE_MUSIC_STANDARD_STEREO` or `AUDIO_PROFILE_MUSIC_HIGH_QUALITY_STEREO`; otherwise, + * the preset setting is invalid. + * - If the 3D voice effect is enabled, users need to use stereo audio playback devices to hear + * the anticipated voice effect. + */ + ROOM_ACOUSTICS_3D_VOICE = 0x02010800, + /** Virtual surround sound. + * + * @note + * - Agora recommends using this enumerator to process virtual surround sound; otherwise, you may + * not hear the anticipated voice effect. + * - To achieve better audio effect quality, Agora recommends calling \ref + * IRtcEngine::setAudioProfile "setAudioProfile" and setting the `profile` parameter to + * `AUDIO_PROFILE_MUSIC_HIGH_QUALITY(4)` or `AUDIO_PROFILE_MUSIC_HIGH_QUALITY_STEREO(5)` before + * setting this enumerator. + */ + ROOM_ACOUSTICS_VIRTUAL_SURROUND_SOUND = 0x02010900, + /** A middle-aged man's voice. + * + * @note + * Agora recommends using this enumerator to process a male-sounding voice; otherwise, you may + * not hear the anticipated voice effect. + */ + VOICE_CHANGER_EFFECT_UNCLE = 0x02020100, + /** A senior man's voice. + * + * @note Agora recommends using this enumerator to process a male-sounding voice; otherwise, you may + * not hear the anticipated voice effect. + */ + VOICE_CHANGER_EFFECT_OLDMAN = 0x02020200, + /** A boy's voice. + * + * @note Agora recommends using this enumerator to process a male-sounding voice; otherwise, you may + * not hear the anticipated voice effect. + */ + VOICE_CHANGER_EFFECT_BOY = 0x02020300, + /** A young woman's voice. + * + * @note + * Agora recommends using this enumerator to process a female-sounding voice; otherwise, you may + * not hear the anticipated voice effect. + */ + VOICE_CHANGER_EFFECT_SISTER = 0x02020400, + /** A girl's voice.
+ * + * @note Agora recommends using this enumerator to process a female-sounding voice; otherwise, you may + * not hear the anticipated voice effect. + */ + VOICE_CHANGER_EFFECT_GIRL = 0x02020500, + /** The voice of Pig King, a character in Journey to the West who has a voice like a growling + * bear. + */ + VOICE_CHANGER_EFFECT_PIGKING = 0x02020600, + /** The Hulk's voice. + */ + VOICE_CHANGER_EFFECT_HULK = 0x02020700, + /** An audio effect typical of R&B music. + * + * @note Before using this preset, set the `profile` parameter of `setAudioProfile` to + * `AUDIO_PROFILE_MUSIC_HIGH_QUALITY` or `AUDIO_PROFILE_MUSIC_HIGH_QUALITY_STEREO`; otherwise, + * the preset setting is invalid. + */ + STYLE_TRANSFORMATION_RNB = 0x02030100, + /** The voice effect typical of popular music. + * + * @note Before using this preset, set the `profile` parameter of `setAudioProfile` to + * `AUDIO_PROFILE_MUSIC_HIGH_QUALITY` or `AUDIO_PROFILE_MUSIC_HIGH_QUALITY_STEREO`; otherwise, + * the preset setting is invalid. + */ + STYLE_TRANSFORMATION_POPULAR = 0x02030200, + /** A pitch correction effect that corrects the user's pitch based on the pitch of the natural C + * major scale. After setting this voice effect, you can call `setAudioEffectParameters` to adjust + * the basic mode of tuning and the pitch of the main tone. + */ + PITCH_CORRECTION = 0x02040100, + + /** TODO: Electronic sound and magic tone have not been implemented yet. + * + */ +}; + +/** The options for SDK preset voice conversion. + */ +enum VOICE_CONVERSION_PRESET { + /** Turn off voice conversion and use the original voice. + */ + VOICE_CONVERSION_OFF = 0x00000000, + /** A gender-neutral voice. To avoid audio distortion, ensure that you use this enumerator to process a female-sounding voice. + */ + VOICE_CHANGER_NEUTRAL = 0x03010100, + /** A sweet voice. To avoid audio distortion, ensure that you use this enumerator to process a female-sounding voice. + */ + VOICE_CHANGER_SWEET = 0x03010200, + /** A steady voice.
To avoid audio distortion, ensure that you use this enumerator to process a male-sounding voice. + */ + VOICE_CHANGER_SOLID = 0x03010300, + /** A deep voice. To avoid audio distortion, ensure that you use this enumerator to process a male-sounding voice. + */ + VOICE_CHANGER_BASS = 0x03010400, + /** A voice like a cartoon character. + */ + VOICE_CHANGER_CARTOON = 0x03010500, + /** A voice like a child. + */ + VOICE_CHANGER_CHILDLIKE = 0x03010600, + /** A voice like a phone operator. + */ + VOICE_CHANGER_PHONE_OPERATOR = 0x03010700, + /** A monster voice. + */ + VOICE_CHANGER_MONSTER = 0x03010800, + /** A voice like Transformers. + */ + VOICE_CHANGER_TRANSFORMERS = 0x03010900, + /** A voice like Groot. + */ + VOICE_CHANGER_GROOT = 0x03010A00, + /** A voice like Darth Vader. + */ + VOICE_CHANGER_DARTH_VADER = 0x03010B00, + /** A rough female voice. + */ + VOICE_CHANGER_IRON_LADY = 0x03010C00, + /** A voice like Crayon Shin-chan. + */ + VOICE_CHANGER_SHIN_CHAN = 0x03010D00, + /** A voice like a castrato. + */ + VOICE_CHANGER_GIRLISH_MAN = 0x03010E00, + /** A voice like a chipmunk. + */ + VOICE_CHANGER_CHIPMUNK = 0x03010F00, + +}; + +/** The options for SDK preset headphone equalizer. + */ +enum HEADPHONE_EQUALIZER_PRESET { + /** Turn off headphone EQ and use the original voice. + */ + HEADPHONE_EQUALIZER_OFF = 0x00000000, + /** For over-ear headphones. + */ + HEADPHONE_EQUALIZER_OVEREAR = 0x04000001, + /** For in-ear headphones. + */ + HEADPHONE_EQUALIZER_INEAR = 0x04000002 +}; + +/** + * Screen sharing configurations. + */ +struct ScreenCaptureParameters { + /** + * On Windows and macOS, it represents the video encoding resolution of the shared screen stream. + * See `VideoDimensions`. The default value is 1920 x 1080, that is, 2,073,600 pixels. Agora uses + * the value of this parameter to calculate the charges. + * + * If the aspect ratio is different between the encoding dimensions and screen dimensions, Agora + * applies the following algorithm for encoding.
Suppose dimensions are 1920 x 1080: + * - If the value of the screen dimensions is lower than that of dimensions, for example, + * 1000 x 1000 pixels, the SDK uses 1000 x 1000 pixels for encoding. + * - If the value of the screen dimensions is higher than that of dimensions, for example, + * 2000 x 1500, the SDK uses the maximum value under dimensions with the aspect ratio of + * the screen dimension (4:3) for encoding, that is, 1440 x 1080. + */ + VideoDimensions dimensions; + /** + * On Windows and macOS, it represents the video encoding frame rate (fps) of the shared screen stream. + * The frame rate (fps) of the shared region. The default value is 5. We do not recommend setting + * this to a value greater than 15. + */ + int frameRate; + /** + * On Windows and macOS, it represents the video encoding bitrate of the shared screen stream. + * The bitrate (Kbps) of the shared region. The default value is 0 (the SDK works out a bitrate + * according to the dimensions of the current screen). + */ + int bitrate; + /** Whether to capture the mouse in screen sharing: + * - `true`: (Default) Capture the mouse. + * - `false`: Do not capture the mouse. + */ + bool captureMouseCursor; + /** + * Whether to bring the window to the front when calling the `startScreenCaptureByWindowId` method to share it: + * - `true`: Bring the window to the front. + * - `false`: (Default) Do not bring the window to the front. + */ + bool windowFocus; + /** + * A list of IDs of windows to be blocked. When calling `startScreenCaptureByDisplayId` to start screen sharing, + * you can use this parameter to block a specified window. When calling `updateScreenCaptureParameters` to update + * screen sharing configurations, you can use this parameter to dynamically block the specified windows during + * screen sharing. + */ + view_t *excludeWindowList; + /** + * The number of windows to be blocked. + */ + int excludeWindowCount; + + /** The width (px) of the border. 
Defaults to 0, and the value range is [0,50]. + * + */ + int highLightWidth; + /** The color of the border in RGBA format. The default value is 0xFF8CBF26. + * + */ + unsigned int highLightColor; + /** Whether to place a border around the shared window or screen: + * - true: Place a border. + * - false: (Default) Do not place a border. + * + * @note When you share a part of a window or screen, the SDK places a border around the entire window or screen if you set `enableHighLight` as true. + * + */ + bool enableHighLight; + + ScreenCaptureParameters() + : dimensions(1920, 1080), frameRate(5), bitrate(STANDARD_BITRATE), captureMouseCursor(true), windowFocus(false), excludeWindowList(OPTIONAL_NULLPTR), excludeWindowCount(0), highLightWidth(0), highLightColor(0), enableHighLight(false) {} + ScreenCaptureParameters(const VideoDimensions& d, int f, int b) + : dimensions(d), frameRate(f), bitrate(b), captureMouseCursor(true), windowFocus(false), excludeWindowList(OPTIONAL_NULLPTR), excludeWindowCount(0), highLightWidth(0), highLightColor(0), enableHighLight(false) {} + ScreenCaptureParameters(int width, int height, int f, int b) + : dimensions(width, height), frameRate(f), bitrate(b), captureMouseCursor(true), windowFocus(false), excludeWindowList(OPTIONAL_NULLPTR), excludeWindowCount(0), highLightWidth(0), highLightColor(0), enableHighLight(false){} + ScreenCaptureParameters(int width, int height, int f, int b, bool cur, bool fcs) + : dimensions(width, height), frameRate(f), bitrate(b), captureMouseCursor(cur), windowFocus(fcs), excludeWindowList(OPTIONAL_NULLPTR), excludeWindowCount(0), highLightWidth(0), highLightColor(0), enableHighLight(false) {} + ScreenCaptureParameters(int width, int height, int f, int b, view_t *ex, int cnt) + : dimensions(width, height), frameRate(f), bitrate(b), captureMouseCursor(true), windowFocus(false), excludeWindowList(ex), excludeWindowCount(cnt), highLightWidth(0), highLightColor(0), enableHighLight(false) {} + 
ScreenCaptureParameters(int width, int height, int f, int b, bool cur, bool fcs, view_t *ex, int cnt) + : dimensions(width, height), frameRate(f), bitrate(b), captureMouseCursor(cur), windowFocus(fcs), excludeWindowList(ex), excludeWindowCount(cnt), highLightWidth(0), highLightColor(0), enableHighLight(false) {} +}; + +/** + * Audio recording quality. + */ +enum AUDIO_RECORDING_QUALITY_TYPE { + /** + * 0: Low quality. The sample rate is 32 kHz, and the file size is around 1.2 MB after 10 minutes of recording. + */ + AUDIO_RECORDING_QUALITY_LOW = 0, + /** + * 1: Medium quality. The sample rate is 32 kHz, and the file size is around 2 MB after 10 minutes of recording. + */ + AUDIO_RECORDING_QUALITY_MEDIUM = 1, + /** + * 2: High quality. The sample rate is 32 kHz, and the file size is around 3.75 MB after 10 minutes of recording. + */ + AUDIO_RECORDING_QUALITY_HIGH = 2, + /** + * 3: Ultra high audio recording quality. + */ + AUDIO_RECORDING_QUALITY_ULTRA_HIGH = 3, +}; + +/** + * Recording content. Set in `startAudioRecording`. + */ +enum AUDIO_FILE_RECORDING_TYPE { + /** + * 1: Only records the audio of the local user. + */ + AUDIO_FILE_RECORDING_MIC = 1, + /** + * 2: Only records the audio of all remote users. + */ + AUDIO_FILE_RECORDING_PLAYBACK = 2, + /** + * 3: Records the mixed audio of the local and all remote users. + */ + AUDIO_FILE_RECORDING_MIXED = 3, +}; + +/** + * Audio encoded frame observer position. + */ +enum AUDIO_ENCODED_FRAME_OBSERVER_POSITION { + /** + * 1: Only records the audio of the local user. + */ + AUDIO_ENCODED_FRAME_OBSERVER_POSITION_RECORD = 1, + /** + * 2: Only records the audio of all remote users. + */ + AUDIO_ENCODED_FRAME_OBSERVER_POSITION_PLAYBACK = 2, + /** + * 3: Records the mixed audio of the local and all remote users. + */ + AUDIO_ENCODED_FRAME_OBSERVER_POSITION_MIXED = 3, +}; + +/** + * Recording configuration. 
+ */ +struct AudioRecordingConfiguration { + /** + * The absolute path (including the filename extension) of the recording file. For example: `C:\music\audio.mp4`. + * @note Ensure that the directory for the recording file exists and is writable. + */ + const char* filePath; + /** + * Whether to encode the audio data: + * - `true`: Encode audio data in AAC. + * - `false`: (Default) Do not encode audio data, but save the recorded audio data directly. + */ + bool encode; + /** + * Recording sample rate (Hz). + * - 16000 + * - (Default) 32000 + * - 44100 + * - 48000 + * @note If you set this parameter to 44100 or 48000, Agora recommends recording WAV files, or AAC files with `quality` + * set to `AUDIO_RECORDING_QUALITY_MEDIUM` or `AUDIO_RECORDING_QUALITY_HIGH`, for better recording quality. + */ + int sampleRate; + /** + * The recording content. See `AUDIO_FILE_RECORDING_TYPE`. + */ + AUDIO_FILE_RECORDING_TYPE fileRecordingType; + /** + * Recording quality. See `AUDIO_RECORDING_QUALITY_TYPE`. + * @note This parameter applies to AAC files only. + */ + AUDIO_RECORDING_QUALITY_TYPE quality; + + /** + * Recording channel.
The following values are supported: + * - (Default) 1 + * - 2 + */ + int recordingChannel; + + AudioRecordingConfiguration() + : filePath(OPTIONAL_NULLPTR), + encode(false), + sampleRate(32000), + fileRecordingType(AUDIO_FILE_RECORDING_MIXED), + quality(AUDIO_RECORDING_QUALITY_LOW), + recordingChannel(1) {} + + AudioRecordingConfiguration(const char* file_path, int sample_rate, AUDIO_RECORDING_QUALITY_TYPE quality_type, int channel) + : filePath(file_path), + encode(false), + sampleRate(sample_rate), + fileRecordingType(AUDIO_FILE_RECORDING_MIXED), + quality(quality_type), + recordingChannel(channel) {} + + AudioRecordingConfiguration(const char* file_path, bool enc, int sample_rate, AUDIO_FILE_RECORDING_TYPE type, AUDIO_RECORDING_QUALITY_TYPE quality_type, int channel) + : filePath(file_path), + encode(enc), + sampleRate(sample_rate), + fileRecordingType(type), + quality(quality_type), + recordingChannel(channel) {} + + AudioRecordingConfiguration(const AudioRecordingConfiguration &rhs) + : filePath(rhs.filePath), + encode(rhs.encode), + sampleRate(rhs.sampleRate), + fileRecordingType(rhs.fileRecordingType), + quality(rhs.quality), + recordingChannel(rhs.recordingChannel) {} +}; + +/** + * Observer settings for the encoded audio. + */ +struct AudioEncodedFrameObserverConfig { + /** + * The position of the audio encoded frame observer. For details, see `AUDIO_ENCODED_FRAME_OBSERVER_POSITION`. + */ + AUDIO_ENCODED_FRAME_OBSERVER_POSITION postionType; + /** + * Audio encoding type. For details, see `AUDIO_ENCODING_TYPE`. + */ + AUDIO_ENCODING_TYPE encodingType; + + AudioEncodedFrameObserverConfig() + : postionType(AUDIO_ENCODED_FRAME_OBSERVER_POSITION_PLAYBACK), + encodingType(AUDIO_ENCODING_TYPE_OPUS_48000_MEDIUM) {} + +}; +/** + * The encoded audio observer. + */ +class IAudioEncodedFrameObserver { +public: +/** +* Gets the encoded audio data of the local user.
+* +* After calling `registerAudioEncodedFrameObserver` and setting the observer position to `AUDIO_ENCODED_FRAME_OBSERVER_POSITION_RECORD`, +* you can get the encoded audio data of the local user from this callback. +* +* @param frameBuffer The pointer to the audio frame buffer. +* @param length The data length (byte) of the audio frame. +* @param audioEncodedFrameInfo Audio information after encoding. For details, see `EncodedAudioFrameInfo`. +*/ +virtual void onRecordAudioEncodedFrame(const uint8_t* frameBuffer, int length, const EncodedAudioFrameInfo& audioEncodedFrameInfo) = 0; + +/** +* Gets the encoded audio data of all remote users. +* +* After calling `registerAudioEncodedFrameObserver` and setting the observer position to `AUDIO_ENCODED_FRAME_OBSERVER_POSITION_PLAYBACK`, +* you can get the encoded audio data of all remote users through this callback. +* +* @param frameBuffer The pointer to the audio frame buffer. +* @param length The data length (byte) of the audio frame. +* @param audioEncodedFrameInfo Audio information after encoding. For details, see `EncodedAudioFrameInfo`. +*/ +virtual void onPlaybackAudioEncodedFrame(const uint8_t* frameBuffer, int length, const EncodedAudioFrameInfo& audioEncodedFrameInfo) = 0; + +/** +* Gets the mixed and encoded audio data of the local and all remote users. +* +* After calling `registerAudioEncodedFrameObserver` and setting the observer position to `AUDIO_ENCODED_FRAME_OBSERVER_POSITION_MIXED`, +* you can get the mixed and encoded audio data of the local and all remote users through this callback. +* +* @param frameBuffer The pointer to the audio frame buffer. +* @param length The data length (byte) of the audio frame. +* @param audioEncodedFrameInfo Audio information after encoding. For details, see `EncodedAudioFrameInfo`.
+*/ +virtual void onMixedAudioEncodedFrame(const uint8_t* frameBuffer, int length, const EncodedAudioFrameInfo& audioEncodedFrameInfo) = 0; + +virtual ~IAudioEncodedFrameObserver() {} +}; + +/** The region for connection, that is, the region where the server that the SDK connects to is located. + */ +enum AREA_CODE { + /** + * Mainland China. + */ + AREA_CODE_CN = 0x00000001, + /** + * North America. + */ + AREA_CODE_NA = 0x00000002, + /** + * Europe. + */ + AREA_CODE_EU = 0x00000004, + /** + * Asia, excluding Mainland China. + */ + AREA_CODE_AS = 0x00000008, + /** + * Japan. + */ + AREA_CODE_JP = 0x00000010, + /** + * India. + */ + AREA_CODE_IN = 0x00000020, + /** + * (Default) Global. + */ + AREA_CODE_GLOB = (0xFFFFFFFF) +}; + +enum AREA_CODE_EX { + /** + * Oceania. + */ + AREA_CODE_OC = 0x00000040, + /** + * South America. + */ + AREA_CODE_SA = 0x00000080, + /** + * Africa. + */ + AREA_CODE_AF = 0x00000100, + /** + * South Korea. + */ + AREA_CODE_KR = 0x00000200, + /** + * Hong Kong and Macau. + */ + AREA_CODE_HKMC = 0x00000400, + /** + * United States. + */ + AREA_CODE_US = 0x00000800, + /** + * The global area (except Mainland China). + */ + AREA_CODE_OVS = 0xFFFFFFFE +}; + +/** + * The error code of the channel media relay. + */ +enum CHANNEL_MEDIA_RELAY_ERROR { + /** 0: No error. + */ + RELAY_OK = 0, + /** 1: An error occurs in the server response. + */ + RELAY_ERROR_SERVER_ERROR_RESPONSE = 1, + /** 2: No server response. You can call the `leaveChannel` method to leave the channel. + * + * This error can also occur if your project has not enabled co-host token authentication. You can contact technical + * support to enable the service for co-hosting across channels before starting a channel media relay. + */ + RELAY_ERROR_SERVER_NO_RESPONSE = 2, + /** 3: The SDK fails to access the service, probably due to limited resources of the server. + */ + RELAY_ERROR_NO_RESOURCE_AVAILABLE = 3, + /** 4: Fails to send the relay request.
+ */ + RELAY_ERROR_FAILED_JOIN_SRC = 4, + /** 5: Fails to accept the relay request. + */ + RELAY_ERROR_FAILED_JOIN_DEST = 5, + /** 6: The server fails to receive the media stream. + */ + RELAY_ERROR_FAILED_PACKET_RECEIVED_FROM_SRC = 6, + /** 7: The server fails to send the media stream. + */ + RELAY_ERROR_FAILED_PACKET_SENT_TO_DEST = 7, + /** 8: The SDK disconnects from the server due to poor network connections. You can call the `leaveChannel` method to + * leave the channel. + */ + RELAY_ERROR_SERVER_CONNECTION_LOST = 8, + /** 9: An internal error occurs in the server. + */ + RELAY_ERROR_INTERNAL_ERROR = 9, + /** 10: The token of the source channel has expired. + */ + RELAY_ERROR_SRC_TOKEN_EXPIRED = 10, + /** 11: The token of the destination channel has expired. + */ + RELAY_ERROR_DEST_TOKEN_EXPIRED = 11, +}; + +/** + * The state code of the channel media relay. + */ +enum CHANNEL_MEDIA_RELAY_STATE { + /** 0: The initial state. After you successfully stop the channel media relay by calling `stopChannelMediaRelay`, + * the `onChannelMediaRelayStateChanged` callback returns this state. + */ + RELAY_STATE_IDLE = 0, + /** 1: The SDK tries to relay the media stream to the destination channel. + */ + RELAY_STATE_CONNECTING = 1, + /** 2: The SDK successfully relays the media stream to the destination channel. + */ + RELAY_STATE_RUNNING = 2, + /** 3: An error occurs. See `code` in `onChannelMediaRelayStateChanged` for the error code. + */ + RELAY_STATE_FAILURE = 3, +}; + +/** The definition of ChannelMediaInfo. + */ +struct ChannelMediaInfo { + /** The user ID. + */ + uid_t uid; + /** The channel name. The default value is NULL, which means that the SDK + * applies the current channel name. + */ + const char* channelName; + /** The token that enables the user to join the channel. The default value + * is NULL, which means that the SDK applies the current token. 
+ */ + const char* token; + + ChannelMediaInfo() : uid(0), channelName(NULL), token(NULL) {} + ChannelMediaInfo(const char* c, const char* t, uid_t u) : uid(u), channelName(c), token(t) {} +}; + +/** The definition of ChannelMediaRelayConfiguration. + */ +struct ChannelMediaRelayConfiguration { + /** The information of the source channel `ChannelMediaInfo`. It contains the following members: + * - `channelName`: The name of the source channel. The default value is `NULL`, which means the SDK applies the name + * of the current channel. + * - `uid`: The unique ID to identify the relay stream in the source channel. The default value is 0, which means the + * SDK generates a random UID. You must set it as 0. + * - `token`: The token for joining the source channel. It is generated with the `channelName` and `uid` you set in + * `srcInfo`. + * - If you have not enabled the App Certificate, set this parameter as the default value `NULL`, which means the + * SDK applies the App ID. + * - If you have enabled the App Certificate, you must use the token generated with the `channelName` and `uid`, and + * the `uid` must be set as 0. + */ + ChannelMediaInfo* srcInfo; + /** The information of the destination channel `ChannelMediaInfo`. It contains the following members: + * - `channelName`: The name of the destination channel. + * - `uid`: The unique ID to identify the relay stream in the destination channel. The value + * ranges from 0 to (2^32-1). To avoid UID conflicts, this `UID` must be different from any + * other `UID` in the destination channel. The default value is 0, which means the SDK generates + * a random `UID`. Do not set this parameter as the `UID` of the host in the destination channel, + * and ensure that this `UID` is different from any other `UID` in the channel. + * - `token`: The token for joining the destination channel. It is generated with the `channelName` + * and `uid` you set in `destInfos`. 
+ * - If you have not enabled the App Certificate, set this parameter as the default value NULL, + * which means the SDK applies the App ID. + * If you have enabled the App Certificate, you must use the token generated with the `channelName` + * and `uid`. + */ + ChannelMediaInfo* destInfos; + /** The number of destination channels. The default value is 0, and the value range is from 0 to + * 6. Ensure that the value of this parameter corresponds to the number of `ChannelMediaInfo` + * structs you define in `destInfo`. + */ + int destCount; + + ChannelMediaRelayConfiguration() : srcInfo(OPTIONAL_NULLPTR), destInfos(OPTIONAL_NULLPTR), destCount(0) {} +}; + +/** + * The uplink network information. + */ +struct UplinkNetworkInfo { + /** + * The target video encoder bitrate (bps). + */ + int video_encoder_target_bitrate_bps; + + UplinkNetworkInfo() : video_encoder_target_bitrate_bps(0) {} + + bool operator==(const UplinkNetworkInfo& rhs) const { + return (video_encoder_target_bitrate_bps == rhs.video_encoder_target_bitrate_bps); + } +}; + +struct DownlinkNetworkInfo { + struct PeerDownlinkInfo { + /** + * The ID of the user who owns the remote video stream. + */ + const char* userId; + /** + * The remote video stream type: #VIDEO_STREAM_TYPE. + */ + VIDEO_STREAM_TYPE stream_type; + /** + * The remote video downscale type: #REMOTE_VIDEO_DOWNSCALE_LEVEL. + */ + REMOTE_VIDEO_DOWNSCALE_LEVEL current_downscale_level; + /** + * The expected bitrate in bps. 
+ */
+ int expected_bitrate_bps;
+
+ PeerDownlinkInfo()
+ : userId(OPTIONAL_NULLPTR),
+ stream_type(VIDEO_STREAM_HIGH),
+ current_downscale_level(REMOTE_VIDEO_DOWNSCALE_LEVEL_NONE),
+ expected_bitrate_bps(-1) {}
+
+ PeerDownlinkInfo(const PeerDownlinkInfo& rhs)
+ : userId(OPTIONAL_NULLPTR),  // initialize so the destructor never sees an indeterminate pointer
+ stream_type(rhs.stream_type),
+ current_downscale_level(rhs.current_downscale_level),
+ expected_bitrate_bps(rhs.expected_bitrate_bps) {
+ if (rhs.userId != OPTIONAL_NULLPTR) {
+ const size_t len = std::strlen(rhs.userId);
+ char* buf = new char[len + 1];
+ std::memcpy(buf, rhs.userId, len);
+ buf[len] = '\0';
+ userId = buf;
+ }
+ }
+
+ PeerDownlinkInfo& operator=(const PeerDownlinkInfo& rhs) {
+ if (this == &rhs) return *this;
+ delete[] userId;  // release the previous copy before overwriting
+ userId = OPTIONAL_NULLPTR;
+ stream_type = rhs.stream_type;
+ current_downscale_level = rhs.current_downscale_level;
+ expected_bitrate_bps = rhs.expected_bitrate_bps;
+ if (rhs.userId != OPTIONAL_NULLPTR) {
+ const size_t len = std::strlen(rhs.userId);
+ char* buf = new char[len + 1];
+ std::memcpy(buf, rhs.userId, len);
+ buf[len] = '\0';
+ userId = buf;
+ }
+ return *this;
+ }
+
+ ~PeerDownlinkInfo() { delete[] userId; }
+ };
+
+ /**
+ * The last-mile buffer delay queue time in ms.
+ */
+ int lastmile_buffer_delay_time_ms;
+ /**
+ * The current downlink bandwidth estimation (bps) after downscale.
+ */
+ int bandwidth_estimation_bps;
+ /**
+ * The total video downscale level count.
+ */
+ int total_downscale_level_count;
+ /**
+ * The peer video downlink info array.
+ */
+ PeerDownlinkInfo* peer_downlink_info;
+ /**
+ * The total video received count. 
+ */ + int total_received_video_count; + + DownlinkNetworkInfo() + : lastmile_buffer_delay_time_ms(-1), + bandwidth_estimation_bps(-1), + total_downscale_level_count(-1), + peer_downlink_info(OPTIONAL_NULLPTR), + total_received_video_count(-1) {} + + DownlinkNetworkInfo(const DownlinkNetworkInfo& info) + : lastmile_buffer_delay_time_ms(info.lastmile_buffer_delay_time_ms), + bandwidth_estimation_bps(info.bandwidth_estimation_bps), + total_downscale_level_count(info.total_downscale_level_count), + peer_downlink_info(OPTIONAL_NULLPTR), + total_received_video_count(info.total_received_video_count) { + if (total_received_video_count <= 0) return; + peer_downlink_info = new PeerDownlinkInfo[total_received_video_count]; + for (int i = 0; i < total_received_video_count; ++i) + peer_downlink_info[i] = info.peer_downlink_info[i]; + } + + DownlinkNetworkInfo& operator=(const DownlinkNetworkInfo& rhs) { + if (this == &rhs) return *this; + lastmile_buffer_delay_time_ms = rhs.lastmile_buffer_delay_time_ms; + bandwidth_estimation_bps = rhs.bandwidth_estimation_bps; + total_downscale_level_count = rhs.total_downscale_level_count; + peer_downlink_info = OPTIONAL_NULLPTR; + total_received_video_count = rhs.total_received_video_count; + if (total_received_video_count > 0) { + peer_downlink_info = new PeerDownlinkInfo[total_received_video_count]; + for (int i = 0; i < total_received_video_count; ++i) + peer_downlink_info[i] = rhs.peer_downlink_info[i]; + } + return *this; + } + + ~DownlinkNetworkInfo() { delete[] peer_downlink_info; } +}; + +/** + * The built-in encryption mode. + * + * Agora recommends using AES_128_GCM2 or AES_256_GCM2 encrypted mode. These two modes support the + * use of salt for higher security. + */ +enum ENCRYPTION_MODE { + /** 1: 128-bit AES encryption, XTS mode. + */ + AES_128_XTS = 1, + /** 2: 128-bit AES encryption, ECB mode. + */ + AES_128_ECB = 2, + /** 3: 256-bit AES encryption, XTS mode. + */ + AES_256_XTS = 3, + /** 4: 128-bit SM4 encryption, ECB mode. 
+ */ + SM4_128_ECB = 4, + /** 5: 128-bit AES encryption, GCM mode. + */ + AES_128_GCM = 5, + /** 6: 256-bit AES encryption, GCM mode. + */ + AES_256_GCM = 6, + /** 7: (Default) 128-bit AES encryption, GCM mode. This encryption mode requires the setting of + * salt (`encryptionKdfSalt`). + */ + AES_128_GCM2 = 7, + /** 8: 256-bit AES encryption, GCM mode. This encryption mode requires the setting of salt (`encryptionKdfSalt`). + */ + AES_256_GCM2 = 8, + /** Enumerator boundary. + */ + MODE_END, +}; + +/** Built-in encryption configurations. */ +struct EncryptionConfig { + /** + * The built-in encryption mode. See #ENCRYPTION_MODE. Agora recommends using `AES_128_GCM2` + * or `AES_256_GCM2` encrypted mode. These two modes support the use of salt for higher security. + */ + ENCRYPTION_MODE encryptionMode; + /** + * Encryption key in string type with unlimited length. Agora recommends using a 32-byte key. + * + * @note If you do not set an encryption key or set it as NULL, you cannot use the built-in encryption, and the SDK returns #ERR_INVALID_ARGUMENT (-2). + */ + const char* encryptionKey; + /** + * Salt, 32 bytes in length. Agora recommends that you use OpenSSL to generate salt on the server side. + * + * @note This parameter takes effect only in `AES_128_GCM2` or `AES_256_GCM2` encrypted mode. + * In this case, ensure that this parameter is not 0. 
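+ *
+ * A minimal configuration sketch (illustrative; the key value and `serverGeneratedSalt` are placeholders):
+ * @code
+ * EncryptionConfig config;
+ * config.encryptionMode = AES_256_GCM2;
+ * config.encryptionKey = "your-32-byte-key-goes-right-here";
+ * memcpy(config.encryptionKdfSalt, serverGeneratedSalt, 32);  // salt generated on your server
+ * @endcode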
+ */
+ uint8_t encryptionKdfSalt[32];
+
+ EncryptionConfig()
+ : encryptionMode(AES_128_GCM2),
+ encryptionKey(OPTIONAL_NULLPTR)
+ {
+ memset(encryptionKdfSalt, 0, sizeof(encryptionKdfSalt));
+ }
+
+ /// @cond
+ const char* getEncryptionString() const {
+ switch(encryptionMode) {
+ case AES_128_XTS:
+ return "aes-128-xts";
+ case AES_128_ECB:
+ return "aes-128-ecb";
+ case AES_256_XTS:
+ return "aes-256-xts";
+ case SM4_128_ECB:
+ return "sm4-128-ecb";
+ case AES_128_GCM:
+ return "aes-128-gcm";
+ case AES_256_GCM:
+ return "aes-256-gcm";
+ case AES_128_GCM2:
+ return "aes-128-gcm-2";
+ case AES_256_GCM2:
+ return "aes-256-gcm-2";
+ default:
+ return "aes-128-gcm-2";
+ }
+ }
+ /// @endcond
+};
+
+/** Encryption error type.
+ */
+enum ENCRYPTION_ERROR_TYPE {
+ /**
+ * 0: Internal reason.
+ */
+ ENCRYPTION_ERROR_INTERNAL_FAILURE = 0,
+ /**
+ * 1: Decryption errors. Ensure that the receiver and the sender use the same encryption mode and key.
+ */
+ ENCRYPTION_ERROR_DECRYPTION_FAILURE = 1,
+ /**
+ * 2: Encryption errors.
+ */
+ ENCRYPTION_ERROR_ENCRYPTION_FAILURE = 2,
+};
+
+enum UPLOAD_ERROR_REASON
+{
+ UPLOAD_SUCCESS = 0,
+ UPLOAD_NET_ERROR = 1,
+ UPLOAD_SERVER_ERROR = 2,
+};
+
+/** The type of the device permission.
+ */
+enum PERMISSION_TYPE {
+ /**
+ * 0: Permission for the audio capture device.
+ */
+ RECORD_AUDIO = 0,
+ /**
+ * 1: Permission for the camera.
+ */
+ CAMERA = 1,
+
+ SCREEN_CAPTURE = 2,
+};
+
+/**
+ * The subscribing state.
+ */
+enum STREAM_SUBSCRIBE_STATE {
+ /**
+ * 0: The initial subscribing state after joining the channel.
+ */
+ SUB_STATE_IDLE = 0,
+ /**
+ * 1: Fails to subscribe to the remote stream. Possible reasons:
+ * - The remote user:
+ * - Calls `muteLocalAudioStream(true)` or `muteLocalVideoStream(true)` to stop sending the local
+ * media stream.
+ * - Calls `disableAudio` or `disableVideo` to disable the local audio or video module. 
+ * - Calls `enableLocalAudio(false)` or `enableLocalVideo(false)` to disable the local audio or video capture. + * - The role of the remote user is audience. + * - The local user calls the following methods to stop receiving remote streams: + * - Calls `muteRemoteAudioStream(true)`, `muteAllRemoteAudioStreams(true)` or `setDefaultMuteAllRemoteAudioStreams(true)` to stop receiving the remote audio streams. + * - Calls `muteRemoteVideoStream(true)`, `muteAllRemoteVideoStreams(true)` or `setDefaultMuteAllRemoteVideoStreams(true)` to stop receiving the remote video streams. + */ + SUB_STATE_NO_SUBSCRIBED = 1, + /** + * 2: Subscribing. + */ + SUB_STATE_SUBSCRIBING = 2, + /** + * 3: Subscribes to and receives the remote stream successfully. + */ + SUB_STATE_SUBSCRIBED = 3 +}; + +/** + * The publishing state. + */ +enum STREAM_PUBLISH_STATE { + /** + * 0: The initial publishing state after joining the channel. + */ + PUB_STATE_IDLE = 0, + /** + * 1: Fails to publish the local stream. Possible reasons: + * - The local user calls `muteLocalAudioStream(true)` or `muteLocalVideoStream(true)` to stop sending the local media stream. + * - The local user calls `disableAudio` or `disableVideo` to disable the local audio or video module. + * - The local user calls `enableLocalAudio(false)` or `enableLocalVideo(false)` to disable the local audio or video capture. + * - The role of the local user is audience. + */ + PUB_STATE_NO_PUBLISHED = 1, + /** + * 2: Publishing. + */ + PUB_STATE_PUBLISHING = 2, + /** + * 3: Publishes successfully. + */ + PUB_STATE_PUBLISHED = 3 +}; + +/** + * The EchoTestConfiguration struct. 
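+ *
+ * A minimal usage sketch (illustrative; `view`, `token`, and `"test-channel"` are placeholders):
+ * @code
+ * EchoTestConfiguration config(view, true, true, token, "test-channel", 2);
+ * @endcode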
+ */ +struct EchoTestConfiguration { + view_t view; + bool enableAudio; + bool enableVideo; + const char* token; + const char* channelId; + int intervalInSeconds; + + EchoTestConfiguration(view_t v, bool ea, bool ev, const char* t, const char* c, const int is) + : view(v), enableAudio(ea), enableVideo(ev), token(t), channelId(c), intervalInSeconds(is) {} + + EchoTestConfiguration() + : view(OPTIONAL_NULLPTR), enableAudio(true), enableVideo(true), token(OPTIONAL_NULLPTR), channelId(OPTIONAL_NULLPTR), intervalInSeconds(2) {} +}; + +/** + * The information of the user. + */ +struct UserInfo { + /** + * The user ID. + */ + uid_t uid; + /** + * The user account. The maximum data length is `MAX_USER_ACCOUNT_LENGTH_TYPE`. + */ + char userAccount[MAX_USER_ACCOUNT_LENGTH]; + + UserInfo() : uid(0) { + userAccount[0] = '\0'; + } +}; + +/** + * The audio filter of in-ear monitoring. + */ +enum EAR_MONITORING_FILTER_TYPE { + /** + * 1: Do not add an audio filter to the in-ear monitor. + */ + EAR_MONITORING_FILTER_NONE = (1<<0), + /** + * 2: Enable audio filters to the in-ear monitor. If you implement functions such as voice + * beautifier and audio effect, users can hear the voice after adding these effects. + */ + EAR_MONITORING_FILTER_BUILT_IN_AUDIO_FILTERS = (1<<1), + /** + * 4: Enable noise suppression to the in-ear monitor. + */ + EAR_MONITORING_FILTER_NOISE_SUPPRESSION = (1<<2) +}; + +/** + * Thread priority type. + */ +enum THREAD_PRIORITY_TYPE { + /** + * 0: Lowest priority. + */ + LOWEST = 0, + /** + * 1: Low priority. + */ + LOW = 1, + /** + * 2: Normal priority. + */ + NORMAL = 2, + /** + * 3: High priority. + */ + HIGH = 3, + /** + * 4. Highest priority. + */ + HIGHEST = 4, + /** + * 5. Critical priority. + */ + CRITICAL = 5, +}; + +#if defined(__ANDROID__) || (defined(__APPLE__) && TARGET_OS_IOS) + +/** + * The video configuration for the shared screen stream. + */ +struct ScreenVideoParameters { + /** + * The dimensions of the video encoding resolution. 
The default value is `1280` × `720`.
+ * For recommended values, see [Recommended video
+ * profiles](https://docs.agora.io/en/Interactive%20Broadcast/game_streaming_video_profile?platform=Android#recommended-video-profiles).
+ * If the aspect ratio of `width` × `height` differs from that of the screen, the SDK adjusts the
+ * video encoding resolution according to the following rules (using an example where `width` ×
+ * `height` is 1280 × 720):
+ * - When the width and height of the screen are both lower than `width` and `height`, the SDK
+ * uses the resolution of the screen for video encoding. For example, if the screen is 640 ×
+ * 360, the SDK uses 640 × 360 for video encoding.
+ * - When either the width or height of the screen is higher than `width` or `height`, the SDK
+ * uses the maximum values that do not exceed those of `width` and `height` while maintaining
+ * the aspect ratio of the screen for video encoding. For example, if the screen is 2000 × 1500,
+ * the SDK uses 960 × 720 for video encoding.
+ *
+ * @note
+ * - The billing of the screen sharing stream is based on the values of `width` and `height`.
+ * When you do not pass in these values, Agora bills you at 1280 × 720;
+ * when you pass in these values, Agora bills you at those values.
+ * For details, see [Pricing for Real-time
+ * Communication](https://docs.agora.io/en/Interactive%20Broadcast/billing_rtc).
+ * - This value does not indicate the orientation mode of the output ratio.
+ * For how to set the video orientation, see `ORIENTATION_MODE`.
+ * - Whether the SDK can support a resolution at 720P depends on the performance of the device.
+ * If you set 720P but the device cannot support it, the video frame rate can be lower.
+ */
+ VideoDimensions dimensions;
+ /**
+ * The video encoding frame rate (fps). The default value is `15`. 
+ * For recommended values, see [Recommended video + * profiles](https://docs.agora.io/en/Interactive%20Broadcast/game_streaming_video_profile?platform=Android#recommended-video-profiles). + */ + int frameRate = 15; + /** + * The video encoding bitrate (Kbps). For recommended values, see [Recommended video + * profiles](https://docs.agora.io/en/Interactive%20Broadcast/game_streaming_video_profile?platform=Android#recommended-video-profiles). + */ + int bitrate; + /* + * The content hint of the screen sharing: + */ + VIDEO_CONTENT_HINT contentHint = VIDEO_CONTENT_HINT::CONTENT_HINT_MOTION; + + ScreenVideoParameters() : dimensions(1280, 720) {} +}; + +/** + * The audio configuration for the shared screen stream. + */ +struct ScreenAudioParameters { + /** + * The audio sample rate (Hz). The default value is `16000`. + */ + int sampleRate = 16000; + /** + * The number of audio channels. The default value is `2`, indicating dual channels. + */ + int channels = 2; + /** + * The volume of the captured system audio. The value range is [0,100]. The default value is + * `100`. + */ + int captureSignalVolume = 100; +}; + +/** + * The configuration of the screen sharing + */ +struct ScreenCaptureParameters2 { + /** + * Determines whether to capture system audio during screen sharing: + * - `true`: Capture. + * - `false`: (Default) Do not capture. + * + * **Note** + * Due to system limitations, capturing system audio is only available for Android API level 29 + * and later (that is, Android 10 and later). + */ + bool captureAudio = false; + /** + * The audio configuration for the shared screen stream. + */ + ScreenAudioParameters audioParams; + /** + * Determines whether to capture the screen during screen sharing: + * - `true`: (Default) Capture. + * - `false`: Do not capture. + * + * **Note** + * Due to system limitations, screen capture is only available for Android API level 21 and later + * (that is, Android 5 and later). 
+ */ + bool captureVideo = true; + /** + * The video configuration for the shared screen stream. + */ + ScreenVideoParameters videoParams; +}; +#endif + +/** + * The tracing event of media rendering. + */ +enum MEDIA_TRACE_EVENT { + /** + * 0: The media frame has been rendered. + */ + MEDIA_TRACE_EVENT_VIDEO_RENDERED = 0, + /** + * 1: The media frame has been decoded. + */ + MEDIA_TRACE_EVENT_VIDEO_DECODED, +}; + +/** + * The video rendering tracing result + */ +struct VideoRenderingTracingInfo { + /** + * Elapsed time from the start tracing time to the time when the tracing event occurred. + */ + int elapsedTime; + /** + * Elapsed time from the start tracing time to the time when join channel. + * + * **Note** + * If the start tracing time is behind the time when join channel, this value will be negative. + */ + int start2JoinChannel; + /** + * Elapsed time from joining channel to finishing joining channel. + */ + int join2JoinSuccess; + /** + * Elapsed time from finishing joining channel to remote user joined. + * + * **Note** + * If the start tracing time is after the time finishing join channel, this value will be + * the elapsed time from the start tracing time to remote user joined. The minimum value is 0. + */ + int joinSuccess2RemoteJoined; + /** + * Elapsed time from remote user joined to set the view. + * + * **Note** + * If the start tracing time is after the time when remote user joined, this value will be + * the elapsed time from the start tracing time to set the view. The minimum value is 0. + */ + int remoteJoined2SetView; + /** + * Elapsed time from remote user joined to the time subscribing remote video stream. + * + * **Note** + * If the start tracing time is after the time when remote user joined, this value will be + * the elapsed time from the start tracing time to the time subscribing remote video stream. + * The minimum value is 0. 
+ */
+ int remoteJoined2UnmuteVideo;
+ /**
+ * Elapsed time from remote user joined to the remote video packet received.
+ *
+ * **Note**
+ * If the start tracing time is after the time when the remote user joined, this value will be
+ * the elapsed time from the start tracing time to the time the remote video packet is received.
+ * The minimum value is 0.
+ */
+ int remoteJoined2PacketReceived;
+};
+
+enum CONFIG_FETCH_TYPE {
+ /**
+ * 1: Fetch config when initializing RtcEngine, without channel info.
+ */
+ CONFIG_FETCH_TYPE_INITIALIZE = 1,
+ /**
+ * 2: Fetch config when joining channel with channel info, such as channel name and uid.
+ */
+ CONFIG_FETCH_TYPE_JOIN_CHANNEL = 2,
+};
+
+
+/** The local proxy mode type. */
+enum LOCAL_PROXY_MODE {
+ /** 0: Connect to the local proxy with high priority; if the local proxy is not reachable, fall back to SD-RTN.
+ */
+ ConnectivityFirst = 0,
+ /** 1: Only connect to the local proxy.
+ */
+ LocalOnly = 1,
+};
+
+struct LogUploadServerInfo {
+ /** Log upload server domain.
+ */
+ const char* serverDomain;
+ /** Log upload server path.
+ */
+ const char* serverPath;
+ /** Log upload server port.
+ */
+ int serverPort;
+ /** Whether to use an HTTPS request:
+ - true: Use an HTTPS request.
+ - false: Use an HTTP request.
+ */
+ bool serverHttps;
+
+ LogUploadServerInfo() : serverDomain(NULL), serverPath(NULL), serverPort(0), serverHttps(true) {}
+
+ LogUploadServerInfo(const char* domain, const char* path, int port, bool https) : serverDomain(domain), serverPath(path), serverPort(port), serverHttps(https) {}
+};
+
+struct AdvancedConfigInfo {
+ /** Log upload server.
+ */
+ LogUploadServerInfo logUploadServer;
+};
+
+struct LocalAccessPointConfiguration {
+ /** Local access point IP address list.
+ */
+ const char** ipList;
+ /** The number of local access point IP addresses.
+ */
+ int ipListSize;
+ /** Local access point domain list.
+ */
+ const char** domainList;
+ /** The number of local access point domains. 
+ */
+ int domainListSize;
+ /** Certificate domain name installed on the specific local access point. Passing "" means using the SNI domain on the specific local access point.
+ * SNI (Server Name Indication) is an extension to the TLS protocol.
+ */
+ const char* verifyDomainName;
+ /** Local proxy connection mode: connectivity first or local only.
+ */
+ LOCAL_PROXY_MODE mode;
+ /** Local proxy connection advanced config info.
+ */
+ AdvancedConfigInfo advancedConfig;
+ LocalAccessPointConfiguration() : ipList(NULL), ipListSize(0), domainList(NULL), domainListSize(0), verifyDomainName(NULL), mode(ConnectivityFirst) {}
+};
+
+/**
+ * The information about recorded media streams.
+ */
+struct RecorderStreamInfo {
+ /**
+ * The channel ID of the audio/video stream to be recorded.
+ */
+ const char* channelId;
+ /**
+ * The user ID.
+ */
+ uid_t uid;
+
+ RecorderStreamInfo() : channelId(NULL), uid(0) {}
+ RecorderStreamInfo(const char* channelId, uid_t uid) : channelId(channelId), uid(uid) {}
+};
+} // namespace rtc
+
+namespace base {
+
+class IEngineBase {
+ public:
+ virtual int queryInterface(rtc::INTERFACE_ID_TYPE iid, void** inter) = 0;
+ virtual ~IEngineBase() {}
+};
+
+class AParameter : public agora::util::AutoPtr<IAgoraParameter> {
+ public:
+ AParameter(IEngineBase& engine) { initialize(&engine); }
+ AParameter(IEngineBase* engine) { initialize(engine); }
+ AParameter(IAgoraParameter* p) : agora::util::AutoPtr<IAgoraParameter>(p) {}
+
+ private:
+ bool initialize(IEngineBase* engine) {
+ IAgoraParameter* p = OPTIONAL_NULLPTR;
+ if (engine && !engine->queryInterface(rtc::AGORA_IID_PARAMETER_ENGINE, (void**)&p)) reset(p);
+ return p != OPTIONAL_NULLPTR;
+ }
+};
+
+class LicenseCallback {
+ public:
+ virtual ~LicenseCallback() {}
+ virtual void onCertificateRequired() = 0;
+ virtual void onLicenseRequest() = 0;
+ virtual void onLicenseValidated() = 0;
+ virtual void onLicenseError(int result) = 0;
+};
+
+} // namespace base
+
+/**
+ * Spatial audio parameters.
+ */
+struct SpatialAudioParams {
+ /**
+ * 
Speaker azimuth in a spherical coordinate system centered on the listener.
+ */
+ Optional<float> speaker_azimuth;
+ /**
+ * Speaker elevation in a spherical coordinate system centered on the listener.
+ */
+ Optional<float> speaker_elevation;
+ /**
+ * Distance between the speaker and the listener.
+ */
+ Optional<float> speaker_distance;
+ /**
+ * Speaker orientation [0-180]; 0 degrees is the same as the listener orientation.
+ */
+ Optional<int> speaker_orientation;
+ /**
+ * Whether to enable blur for the speaker.
+ */
+ Optional<bool> enable_blur;
+ /**
+ * Whether to enable air absorption for the speaker.
+ */
+ Optional<bool> enable_air_absorb;
+ /**
+ * Speaker attenuation factor.
+ */
+ Optional<float> speaker_attenuation;
+ /**
+ * Whether to enable the Doppler effect.
+ */
+ Optional<bool> enable_doppler;
+};
+/**
+ * Layout info of a video stream that composes a transcoded video stream.
+*/
+struct VideoLayout
+{
+ /**
+ * ID of the channel from which this video stream comes.
+ */
+ const char* channelId;
+ /**
+ * User ID of the video stream.
+ */
+ rtc::uid_t uid;
+ /**
+ * User account of the video stream.
+ */
+ user_id_t strUid;
+ /**
+ * x coordinate of the video stream on the transcoded video canvas.
+ */
+ uint32_t x;
+ /**
+ * y coordinate of the video stream on the transcoded video canvas.
+ */
+ uint32_t y;
+ /**
+ * Width of the video stream on the transcoded video canvas.
+ */
+ uint32_t width;
+ /**
+ * Height of the video stream on the transcoded video canvas.
+ */
+ uint32_t height;
+ /**
+ * Video state of the video stream on the transcoded video canvas:
+ * 0 for normal video, 1 for a placeholder image, 2 for a black image.
+ */
+ uint32_t videoState;
+
+ VideoLayout() : channelId(OPTIONAL_NULLPTR), uid(0), strUid(OPTIONAL_NULLPTR), x(0), y(0), width(0), height(0), videoState(0) {}
+};
+} // namespace agora
+
+/**
+ * Gets the version of the SDK.
+ * @param [out] build The build number of the Agora SDK.
+ * @return The string of the version of the SDK. 
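+ *
+ * Example (illustrative):
+ * @code
+ * int build = 0;
+ * const char* version = getAgoraSdkVersion(&build);
+ * @endcode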
+ */
+AGORA_API const char* AGORA_CALL getAgoraSdkVersion(int* build);
+
+/**
+ * Gets the error description of an error code.
+ * @param [in] err The error code.
+ * @return The description of the error code.
+ */
+AGORA_API const char* AGORA_CALL getAgoraSdkErrorDescription(int err);
+
+AGORA_API int AGORA_CALL setAgoraSdkExternalSymbolLoader(void* (*func)(const char* symname));
+
+/**
+ * Generates a credential.
+ * @param [in, out] credential The content of the credential.
+ * @return The description of the error code.
+ * @note For license only. Each call generates a different credential,
+ * so call it only once per device and then save the credential.
+ */
+AGORA_API int AGORA_CALL createAgoraCredential(agora::util::AString &credential);
+
+/**
+ * Verifies the given certificate and returns the result.
+ * When you receive the onCertificateRequired event, you must validate the certificate by calling
+ * this function. This is a synchronous call; if validation succeeds, it returns ERR_OK. If
+ * validation fails, you will not be able to join a channel and ERR_CERT_FAIL is returned.
+ * @param [in] credential_buf Pointer to the credential's content.
+ * @param [in] credential_len The length of the credential's content.
+ * @param [in] certificate_buf Pointer to the certificate's content.
+ * @param [in] certificate_len The length of the certificate's content.
+ * @return The description of the error code.
+ * @note For license only.
+ */
+AGORA_API int AGORA_CALL getAgoraCertificateVerifyResult(const char *credential_buf, int credential_len,
+ const char *certificate_buf, int certificate_len);
+
+/**
+ * @brief Implement the agora::base::LicenseCallback and
+ * create a LicenseCallback object to receive license callbacks.
+ *
+ * @param [in] callback The object of agora::base::LicenseCallback;
+ * set the callback to null before deleting it. 
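+ *
+ * A typical lifecycle sketch (illustrative; `myCallback` is a placeholder for your own
+ * agora::base::LicenseCallback subclass):
+ * @code
+ * setAgoraLicenseCallback(myCallback);  // register
+ * // ...
+ * setAgoraLicenseCallback(NULL);        // clear before deleting myCallback
+ * delete myCallback;
+ * @endcode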
+ */
+AGORA_API void setAgoraLicenseCallback(agora::base::LicenseCallback *callback);
+
+/**
+ * @brief Get the LicenseCallback pointer if already set up;
+ * otherwise, return null.
+ *
+ * @return A pointer to agora::base::LicenseCallback.
+ */
+AGORA_API agora::base::LicenseCallback* getAgoraLicenseCallback();
+
+/*
+ * Get the monotonic time in ms, which can be used as the capture time.
+ * A typical scenario is as follows:
+ *
+ * ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+ * | // custom audio/video base capture time, e.g. the first audio/video capture time.             |
+ * | int64_t custom_capture_time_base;                                                             |
+ * |                                                                                               |
+ * | int64_t agora_monotonic_time = getAgoraCurrentMonotonicTimeInMs();                            |
+ * |                                                                                               |
+ * | // offset is fixed once calculated in the beginning.                                          |
+ * | const int64_t offset = agora_monotonic_time - custom_capture_time_base;                       |
+ * |                                                                                               |
+ * | // realtime_custom_audio/video_capture_time is the origin capture time that customer provided.|
+ * | // actual_audio/video_capture_time is the actual capture time transferred to the sdk.         |
+ * | int64_t actual_audio_capture_time = realtime_custom_audio_capture_time + offset;              |
+ * | int64_t actual_video_capture_time = realtime_custom_video_capture_time + offset;              |
+ * ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+ *
+ * @return
+ * - >= 0: Success.
+ * - < 0: Failure.
+ */
+AGORA_API int64_t AGORA_CALL getAgoraCurrentMonotonicTimeInMs();
diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraMediaBase.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraMediaBase.h
new file mode 100644
index 000000000..15dfd4b38
--- /dev/null
+++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraMediaBase.h
@@ -0,0 +1,1671 @@
+// Agora Engine SDK
+//
+// Created by Sting Feng in 2017-11.
+// Copyright (c) 2017 Agora.io. All rights reserved. 
+
+#pragma once  // NOLINT(build/header_guard)
+
+#include <cstring>
+#include <limits>
+#include <stddef.h>
+#include <stdint.h>
+
+#ifndef OPTIONAL_ENUM_SIZE_T
+#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800)
+#define OPTIONAL_ENUM_SIZE_T enum : size_t
+#else
+#define OPTIONAL_ENUM_SIZE_T enum
+#endif
+#endif
+
+#if !defined(__APPLE__)
+#define __deprecated
+#endif
+
+namespace agora {
+namespace rtc {
+
+typedef unsigned int uid_t;
+typedef unsigned int track_id_t;
+typedef unsigned int conn_id_t;
+typedef unsigned int video_track_id_t;
+
+static const unsigned int INVALID_TRACK_ID = 0xffffffff;
+static const unsigned int DEFAULT_CONNECTION_ID = 0;
+static const unsigned int DUMMY_CONNECTION_ID = (std::numeric_limits<unsigned int>::max)();
+
+struct EncodedVideoFrameInfo;
+
+/**
+* Video source types definition.
+**/
+enum VIDEO_SOURCE_TYPE {
+ /** Video captured by the camera.
+ */
+ VIDEO_SOURCE_CAMERA_PRIMARY = 0,
+ VIDEO_SOURCE_CAMERA = VIDEO_SOURCE_CAMERA_PRIMARY,
+ /** Video captured by the secondary camera.
+ */
+ VIDEO_SOURCE_CAMERA_SECONDARY = 1,
+ /** Video for screen sharing.
+ */
+ VIDEO_SOURCE_SCREEN_PRIMARY = 2,
+ VIDEO_SOURCE_SCREEN = VIDEO_SOURCE_SCREEN_PRIMARY,
+ /** Video for secondary screen sharing.
+ */
+ VIDEO_SOURCE_SCREEN_SECONDARY = 3,
+ /** Not defined.
+ */
+ VIDEO_SOURCE_CUSTOM = 4,
+ /** Video for media player sharing.
+ */
+ VIDEO_SOURCE_MEDIA_PLAYER = 5,
+ /** Video for a PNG image.
+ */
+ VIDEO_SOURCE_RTC_IMAGE_PNG = 6,
+ /** Video for a JPEG image.
+ */
+ VIDEO_SOURCE_RTC_IMAGE_JPEG = 7,
+ /** Video for a GIF image.
+ */
+ VIDEO_SOURCE_RTC_IMAGE_GIF = 8,
+ /** Remote video received from the network.
+ */
+ VIDEO_SOURCE_REMOTE = 9,
+ /** Transcoded video.
+ */
+ VIDEO_SOURCE_TRANSCODED = 10,
+
+ /** Video captured by the third camera.
+ */
+ VIDEO_SOURCE_CAMERA_THIRD = 11,
+ /** Video captured by the fourth camera.
+ */
+ VIDEO_SOURCE_CAMERA_FOURTH = 12,
+ /** Video for third screen sharing.
+ */
+ VIDEO_SOURCE_SCREEN_THIRD = 13,
+ /** Video for fourth screen sharing. 
+ */ + VIDEO_SOURCE_SCREEN_FOURTH = 14, + + VIDEO_SOURCE_UNKNOWN = 100 +}; + +/** + * Audio routes. + */ +enum AudioRoute +{ + /** + * -1: The default audio route. + */ + ROUTE_DEFAULT = -1, + /** + * The Headset. + */ + ROUTE_HEADSET = 0, + /** + * The Earpiece. + */ + ROUTE_EARPIECE = 1, + /** + * The Headset with no microphone. + */ + ROUTE_HEADSETNOMIC = 2, + /** + * The Speakerphone. + */ + ROUTE_SPEAKERPHONE = 3, + /** + * The Loudspeaker. + */ + ROUTE_LOUDSPEAKER = 4, + /** + * The Bluetooth Headset via HFP. + */ + ROUTE_HEADSETBLUETOOTH = 5, + /** + * The USB. + */ + ROUTE_USB = 6, + /** + * The HDMI. + */ + ROUTE_HDMI = 7, + /** + * The DisplayPort. + */ + ROUTE_DISPLAYPORT = 8, + /** + * The AirPlay. + */ + ROUTE_AIRPLAY = 9, + /** + * The Bluetooth Speaker via A2DP. + */ + ROUTE_BLUETOOTH_SPEAKER = 10, +}; + +/** + * Bytes per sample + */ +enum BYTES_PER_SAMPLE { + /** + * two bytes per sample + */ + TWO_BYTES_PER_SAMPLE = 2, +}; + +struct AudioParameters { + int sample_rate; + size_t channels; + size_t frames_per_buffer; + + AudioParameters() + : sample_rate(0), + channels(0), + frames_per_buffer(0) {} +}; + +/** + * The use mode of the audio data. + */ +enum RAW_AUDIO_FRAME_OP_MODE_TYPE { + /** 0: Read-only mode: Users only read the data from `AudioFrame` without modifying anything. + * For example, when users acquire the data with the Agora SDK, then start the media push. + */ + RAW_AUDIO_FRAME_OP_MODE_READ_ONLY = 0, + + /** 2: Read and write mode: Users read the data from `AudioFrame`, modify it, and then play it. + * For example, when users have their own audio-effect processing module and perform some voice pre-processing, such as a voice change. + */ + RAW_AUDIO_FRAME_OP_MODE_READ_WRITE = 2, +}; + +} // namespace rtc + +namespace media { + /** + * The type of media device. + */ +enum MEDIA_SOURCE_TYPE { + /** + * 0: The audio playback device. + */ + AUDIO_PLAYOUT_SOURCE = 0, + /** + * 1: Microphone. 
+   */
+  AUDIO_RECORDING_SOURCE = 1,
+  /**
+   * 2: Video captured by the primary camera.
+   */
+  PRIMARY_CAMERA_SOURCE = 2,
+  /**
+   * 3: Video captured by the secondary camera.
+   */
+  SECONDARY_CAMERA_SOURCE = 3,
+  /**
+   * 4: Video captured by the primary screen capturer.
+   */
+  PRIMARY_SCREEN_SOURCE = 4,
+  /**
+   * 5: Video captured by the secondary screen capturer.
+   */
+  SECONDARY_SCREEN_SOURCE = 5,
+  /**
+   * 6: Video captured by a custom video source.
+   */
+  CUSTOM_VIDEO_SOURCE = 6,
+  /**
+   * 7: Video for media player sharing.
+   */
+  MEDIA_PLAYER_SOURCE = 7,
+  /**
+   * 8: Video for PNG image.
+   */
+  RTC_IMAGE_PNG_SOURCE = 8,
+  /**
+   * 9: Video for JPEG image.
+   */
+  RTC_IMAGE_JPEG_SOURCE = 9,
+  /**
+   * 10: Video for GIF image.
+   */
+  RTC_IMAGE_GIF_SOURCE = 10,
+  /**
+   * 11: Remote video received from the network.
+   */
+  REMOTE_VIDEO_SOURCE = 11,
+  /**
+   * 12: Transcoded video.
+   */
+  TRANSCODED_VIDEO_SOURCE = 12,
+  /**
+   * 100: Internal usage only.
+   */
+  UNKNOWN_MEDIA_SOURCE = 100
+};
+/** Content inspect definitions.
+ */
+#define MAX_CONTENT_INSPECT_MODULE_COUNT 32
+enum CONTENT_INSPECT_RESULT {
+  CONTENT_INSPECT_NEUTRAL = 1,
+  CONTENT_INSPECT_SEXY = 2,
+  CONTENT_INSPECT_PORN = 3,
+};
+
+enum CONTENT_INSPECT_TYPE {
+  /**
+   * 0: (Default) Invalid content inspect type.
+   */
+  CONTENT_INSPECT_INVALID = 0,
+  /**
+   * @deprecated
+   * 1: Content inspect type moderation.
+   */
+  CONTENT_INSPECT_MODERATION __deprecated = 1,
+  /**
+   * 2: Content inspect type supervision.
+   */
+  CONTENT_INSPECT_SUPERVISION = 2,
+  /**
+   * 3: Content inspect type image moderation.
+   */
+  CONTENT_INSPECT_IMAGE_MODERATION = 3
+};
+
+struct ContentInspectModule {
+  /**
+   * The content inspect module type.
+   */
+  CONTENT_INSPECT_TYPE type;
+  /** The content inspect frequency in seconds. The default is 0; a frequency <= 0 is invalid.
+   */
+  unsigned int interval;
+  ContentInspectModule() {
+    type = CONTENT_INSPECT_INVALID;
+    interval = 0;
+  }
+};
+/** Definition of ContentInspectConfig.
+ */
+struct ContentInspectConfig {
+  const char* extraInfo;
+  /**
+   * The specific server configuration for image moderation. Please contact technical support.
+   */
+  const char* serverConfig;
+  /** The content inspect modules. The maximum number of modules is 32; the content (a snapshot of
+   * the sent video stream, or an image) can be inspected by up to 32 module types.
+   */
+  ContentInspectModule modules[MAX_CONTENT_INSPECT_MODULE_COUNT];
+  /** The content inspect module count.
+   */
+  int moduleCount;
+  ContentInspectConfig& operator=(const ContentInspectConfig& rth) {
+    extraInfo = rth.extraInfo;
+    serverConfig = rth.serverConfig;
+    moduleCount = rth.moduleCount;
+    memcpy(&modules, &rth.modules, MAX_CONTENT_INSPECT_MODULE_COUNT * sizeof(ContentInspectModule));
+    return *this;
+  }
+  ContentInspectConfig() : extraInfo(NULL), serverConfig(NULL), moduleCount(0) {}
+};
+
+namespace base {
+
+typedef void* view_t;
+
+typedef const char* user_id_t;
+
+static const uint8_t kMaxCodecNameLength = 50;
+
+/**
+ * The definition of the PacketOptions struct, which contains information about the packet
+ * in the RTP (Real-time Transport Protocol) header.
+ */
+struct PacketOptions {
+  /**
+   * The timestamp of the packet.
+   */
+  uint32_t timestamp;
+  // Audio level indication.
+  uint8_t audioLevelIndication;
+  PacketOptions()
+      : timestamp(0),
+        audioLevelIndication(127) {}
+};
+
+/**
+ * The detailed information of the incoming encoded audio frame.
+ */
+struct AudioEncodedFrameInfo {
+  /**
+   * The send time of the packet.
+   */
+  uint64_t sendTs;
+  /**
+   * The codec of the packet.
+   */
+  uint8_t codec;
+  AudioEncodedFrameInfo()
+      : sendTs(0),
+        codec(0) {}
+};
+
+/**
+ * The detailed information of the incoming audio frame in the PCM format.
+ */
+struct AudioPcmFrame {
+  /**
+   * The buffer size of the PCM audio frame.
+   */
+  OPTIONAL_ENUM_SIZE_T {
+    // Stereo, 32 kHz, 60 ms (2 * 32 * 60)
+    /**
+     * The max number of the samples of the data.
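The `modules` array and `moduleCount` travel together, so a config is only as meaningful as its count. Below is a minimal, self-contained sketch of filling such a config. It uses simplified stand-in types that mirror the structs above rather than linking the SDK headers, and the 5-second interval is an arbitrary illustration, not a recommended value:

```cpp
#include <cassert>

// Simplified stand-ins mirroring the header above (illustration only).
enum CONTENT_INSPECT_TYPE {
  CONTENT_INSPECT_INVALID = 0,
  CONTENT_INSPECT_SUPERVISION = 2,
  CONTENT_INSPECT_IMAGE_MODERATION = 3
};
#define MAX_CONTENT_INSPECT_MODULE_COUNT 32

struct ContentInspectModule {
  CONTENT_INSPECT_TYPE type;
  unsigned int interval;  // snapshot frequency in seconds; <= 0 is invalid
  ContentInspectModule() : type(CONTENT_INSPECT_INVALID), interval(0) {}
};

struct ContentInspectConfig {
  const char* extraInfo;
  const char* serverConfig;
  ContentInspectModule modules[MAX_CONTENT_INSPECT_MODULE_COUNT];
  int moduleCount;
  ContentInspectConfig() : extraInfo(nullptr), serverConfig(nullptr), moduleCount(0) {}
};

// Fill a config with one supervision module that samples every 5 seconds.
ContentInspectConfig makeConfig() {
  ContentInspectConfig cfg;
  cfg.modules[0].type = CONTENT_INSPECT_SUPERVISION;
  cfg.modules[0].interval = 5;
  cfg.moduleCount = 1;  // must match the number of filled modules
  return cfg;
}
```

The key invariant is that `moduleCount` reflects exactly how many `modules` entries were filled in; entries beyond it are ignored.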
+     *
+     * When the number of audio channels is two, the sample rate is 32 kHz,
+     * and the buffer duration is 60 ms, the number of samples in the data is 3840 (2 × 32 × 60).
+     */
+    kMaxDataSizeSamples = 3840,
+    /** The max number of the bytes of the data. */
+    kMaxDataSizeBytes = kMaxDataSizeSamples * sizeof(int16_t),
+  };
+
+  /** The timestamp (ms) of the audio frame.
+   */
+  int64_t capture_timestamp;
+  /** The number of samples per channel.
+   */
+  size_t samples_per_channel_;
+  /** The sample rate (Hz) of the audio data.
+   */
+  int sample_rate_hz_;
+  /** The number of channels.
+   */
+  size_t num_channels_;
+  /** The number of bytes per sample.
+   */
+  rtc::BYTES_PER_SAMPLE bytes_per_sample;
+  /** The audio frame data. */
+  int16_t data_[kMaxDataSizeSamples];
+
+  AudioPcmFrame& operator=(const AudioPcmFrame& src) {
+    if (this == &src) {
+      return *this;
+    }
+
+    this->capture_timestamp = src.capture_timestamp;
+    this->samples_per_channel_ = src.samples_per_channel_;
+    this->sample_rate_hz_ = src.sample_rate_hz_;
+    this->bytes_per_sample = src.bytes_per_sample;
+    this->num_channels_ = src.num_channels_;
+
+    size_t length = src.samples_per_channel_ * src.num_channels_;
+    if (length > kMaxDataSizeSamples) {
+      length = kMaxDataSizeSamples;
+    }
+
+    memcpy(this->data_, src.data_, length * sizeof(int16_t));
+
+    return *this;
+  }
+
+  AudioPcmFrame()
+      : capture_timestamp(0),
+        samples_per_channel_(0),
+        sample_rate_hz_(0),
+        num_channels_(0),
+        bytes_per_sample(rtc::TWO_BYTES_PER_SAMPLE) {
+    memset(data_, 0, sizeof(data_));
+  }
+
+  AudioPcmFrame(const AudioPcmFrame& src)
+      : capture_timestamp(src.capture_timestamp),
+        samples_per_channel_(src.samples_per_channel_),
+        sample_rate_hz_(src.sample_rate_hz_),
+        num_channels_(src.num_channels_),
+        bytes_per_sample(src.bytes_per_sample) {
+    size_t length = src.samples_per_channel_ * src.num_channels_;
+    if (length > kMaxDataSizeSamples) {
+      length = kMaxDataSizeSamples;
+    }
+
+    memcpy(this->data_, src.data_, length * sizeof(int16_t));
+  }
+};
+
+/** Audio dual-mono output mode.
+ */
+enum AUDIO_DUAL_MONO_MODE {
+  /**< ChanLOut=ChanLin, ChanRout=ChanRin */
+  AUDIO_DUAL_MONO_STEREO = 0,
+  /**< ChanLOut=ChanRout=ChanLin */
+  AUDIO_DUAL_MONO_L = 1,
+  /**< ChanLOut=ChanRout=ChanRin */
+  AUDIO_DUAL_MONO_R = 2,
+  /**< ChanLout=ChanRout=(ChanLin+ChanRin)/2 */
+  AUDIO_DUAL_MONO_MIX = 3
+};
+
+/**
+ * Video pixel formats.
+ */
+enum VIDEO_PIXEL_FORMAT {
+  /**
+   * 0: Default format.
+   */
+  VIDEO_PIXEL_DEFAULT = 0,
+  /**
+   * 1: I420.
+   */
+  VIDEO_PIXEL_I420 = 1,
+  /**
+   * 2: BGRA.
+   */
+  VIDEO_PIXEL_BGRA = 2,
+  /**
+   * 3: NV21.
+   */
+  VIDEO_PIXEL_NV21 = 3,
+  /**
+   * 4: RGBA.
+   */
+  VIDEO_PIXEL_RGBA = 4,
+  /**
+   * 8: NV12.
+   */
+  VIDEO_PIXEL_NV12 = 8,
+  /**
+   * 10: GL_TEXTURE_2D
+   */
+  VIDEO_TEXTURE_2D = 10,
+  /**
+   * 11: GL_TEXTURE_OES
+   */
+  VIDEO_TEXTURE_OES = 11,
+  /**
+   * 12: Pixel format for iOS CVPixelBuffer NV12.
+   */
+  VIDEO_CVPIXEL_NV12 = 12,
+  /**
+   * 13: Pixel format for iOS CVPixelBuffer I420.
+   */
+  VIDEO_CVPIXEL_I420 = 13,
+  /**
+   * 14: Pixel format for iOS CVPixelBuffer BGRA.
+   */
+  VIDEO_CVPIXEL_BGRA = 14,
+  /**
+   * 16: I422.
+   */
+  VIDEO_PIXEL_I422 = 16,
+  /**
+   * 17: ID3D11Texture2D; only the DXGI_FORMAT_B8G8R8A8_UNORM, DXGI_FORMAT_B8G8R8A8_TYPELESS, and DXGI_FORMAT_NV12 texture formats are supported.
+   */
+  VIDEO_TEXTURE_ID3D11TEXTURE2D = 17,
+};
+
+/**
+ * The video display mode.
+ */
+enum RENDER_MODE_TYPE {
+  /**
+   * 1: Uniformly scale the video until it fills the visible boundaries
+   * (cropped). One dimension of the video may have clipped contents.
+   */
+  RENDER_MODE_HIDDEN = 1,
+  /**
+   * 2: Uniformly scale the video until one of its dimensions fits the boundary
+   * (zoomed to fit). Areas that are not filled due to the disparity in the
+   * aspect ratio will be filled with black.
+   */
+  RENDER_MODE_FIT = 2,
+  /**
+   * @deprecated
+   * 3: This mode is deprecated.
+   */
+  RENDER_MODE_ADAPTIVE __deprecated = 3,
+};
+
+/**
+ * The camera video source type.
+ */
+enum CAMERA_VIDEO_SOURCE_TYPE {
+  /**
+   * 0: The video frame comes from the front camera.
+   */
+  CAMERA_SOURCE_FRONT = 0,
+  /**
+   * 1: The video frame comes from the back camera.
+   */
+  CAMERA_SOURCE_BACK = 1,
+  /**
+   * 2: The video frame source is unspecified.
+   */
+  VIDEO_SOURCE_UNSPECIFIED = 2,
+};
+
+/**
+ * The IVideoFrameMetaInfo class.
+ * This interface provides access to metadata information.
+ */
+class IVideoFrameMetaInfo {
+ public:
+  enum META_INFO_KEY {
+    KEY_FACE_CAPTURE = 0,
+  };
+  virtual ~IVideoFrameMetaInfo() {}
+  virtual const char* getMetaInfoStr(META_INFO_KEY key) const = 0;
+};
+
+/**
+ * The definition of the ExternalVideoFrame struct.
+ */
+struct ExternalVideoFrame {
+  ExternalVideoFrame()
+      : type(VIDEO_BUFFER_RAW_DATA),
+        format(VIDEO_PIXEL_DEFAULT),
+        buffer(NULL),
+        stride(0),
+        height(0),
+        cropLeft(0),
+        cropTop(0),
+        cropRight(0),
+        cropBottom(0),
+        rotation(0),
+        timestamp(0),
+        eglContext(NULL),
+        eglType(EGL_CONTEXT10),
+        textureId(0),
+        metadata_buffer(NULL),
+        metadata_size(0),
+        alphaBuffer(NULL),
+        d3d11_texture_2d(NULL),
+        texture_slice_index(0) {}
+
+  /**
+   * The EGL context type.
+   */
+  enum EGL_CONTEXT_TYPE {
+    /**
+     * 0: When using the OpenGL interface (javax.microedition.khronos.egl.*) defined by Khronos.
+     */
+    EGL_CONTEXT10 = 0,
+    /**
+     * 1: When using the OpenGL interface (android.opengl.*) defined by Android.
+     */
+    EGL_CONTEXT14 = 1,
+  };
+
+  /**
+   * Video buffer types.
+   */
+  enum VIDEO_BUFFER_TYPE {
+    /**
+     * 1: Raw data.
+     */
+    VIDEO_BUFFER_RAW_DATA = 1,
+    /**
+     * 2: The same as VIDEO_BUFFER_RAW_DATA.
+     */
+    VIDEO_BUFFER_ARRAY = 2,
+    /**
+     * 3: The video buffer in the format of texture.
+     */
+    VIDEO_BUFFER_TEXTURE = 3,
+  };
+
+  /**
+   * The buffer type: #VIDEO_BUFFER_TYPE.
+   */
+  VIDEO_BUFFER_TYPE type;
+  /**
+   * The pixel format: #VIDEO_PIXEL_FORMAT
+   */
+  VIDEO_PIXEL_FORMAT format;
+  /**
+   * The video buffer.
+   */
+  void* buffer;
+  /**
+   * The line spacing of the incoming video frame (px). For
+   * texture, it is the width of the texture.
+   */
+  int stride;
+  /**
+   * The height of the incoming video frame.
+   */
+  int height;
+  /**
+   * [Raw data related parameter] The number of pixels trimmed from the left. The default value is 0.
+   */
+  int cropLeft;
+  /**
+   * [Raw data related parameter] The number of pixels trimmed from the top. The default value is 0.
+   */
+  int cropTop;
+  /**
+   * [Raw data related parameter] The number of pixels trimmed from the right. The default value is 0.
+   */
+  int cropRight;
+  /**
+   * [Raw data related parameter] The number of pixels trimmed from the bottom. The default value is 0.
+   */
+  int cropBottom;
+  /**
+   * [Raw data related parameter] The clockwise rotation of the video frame. The supported values
+   * are 0, 90, 180, and 270. The default value is 0.
+   */
+  int rotation;
+  /**
+   * The timestamp (ms) of the incoming video frame. An incorrect timestamp results in frame loss or
+   * unsynchronized audio and video.
+   *
+   * Please refer to getAgoraCurrentMonotonicTimeInMs or getCurrentMonotonicTimeInMs
+   * to determine how to fill this field.
+   */
+  long long timestamp;
+  /**
+   * [Texture-related parameter]
+   * When using the OpenGL interface (javax.microedition.khronos.egl.*) defined by Khronos, set this field to the EGLContext.
+   * When using the OpenGL interface (android.opengl.*) defined by Android, set this field to the EGLContext.
+   */
+  void* eglContext;
+  /**
+   * [Texture related parameter] The EGL context type.
+   */
+  EGL_CONTEXT_TYPE eglType;
+  /**
+   * [Texture related parameter] The texture ID used by the video frame.
+   */
+  int textureId;
+  /**
+   * [Texture related parameter] The incoming 4 × 4 transformational matrix. The typical value is a unit matrix.
+   */
+  float matrix[16];
+  /**
+   * [Texture related parameter] The MetaData buffer.
+   * The default value is NULL.
+   */
+  uint8_t* metadata_buffer;
+  /**
+   * [Texture related parameter] The MetaData size.
+   * The default value is 0.
+   */
+  int metadata_size;
+  /**
+   * The output data of the portrait segmentation algorithm, which is consistent with the size of the video frame.
+   * The value range of each pixel is [0,255]: 0 represents the background, and 255 represents the foreground (portrait).
+   * The default value is NULL.
+   */
+  uint8_t* alphaBuffer;
+
+  /**
+   * [For Windows only] The pointer to the ID3D11Texture2D used by the video frame.
+   */
+  void* d3d11_texture_2d;
+
+  /**
+   * [For Windows only] The index into the ID3D11Texture2D array used by the video frame.
+   */
+  int texture_slice_index;
+};
+
+/**
+ * The definition of the VideoFrame struct.
+ */
+struct VideoFrame {
+  VideoFrame()
+      : type(VIDEO_PIXEL_DEFAULT),
+        width(0),
+        height(0),
+        yStride(0),
+        uStride(0),
+        vStride(0),
+        yBuffer(NULL),
+        uBuffer(NULL),
+        vBuffer(NULL),
+        rotation(0),
+        renderTimeMs(0),
+        avsync_type(0),
+        metadata_buffer(NULL),
+        metadata_size(0),
+        sharedContext(0),
+        textureId(0),
+        d3d11Texture2d(NULL),
+        alphaBuffer(NULL),
+        pixelBuffer(NULL),
+        metaInfo(NULL) {
+    memset(matrix, 0, sizeof(matrix));
+  }
+  /**
+   * The video pixel format: #VIDEO_PIXEL_FORMAT.
+   */
+  VIDEO_PIXEL_FORMAT type;
+  /**
+   * The width of the video frame.
+   */
+  int width;
+  /**
+   * The height of the video frame.
+   */
+  int height;
+  /**
+   * The line span of the Y buffer in the YUV data.
+   */
+  int yStride;
+  /**
+   * The line span of the U buffer in the YUV data.
+   */
+  int uStride;
+  /**
+   * The line span of the V buffer in the YUV data.
+   */
+  int vStride;
+  /**
+   * The pointer to the Y buffer in the YUV data.
+   */
+  uint8_t* yBuffer;
+  /**
+   * The pointer to the U buffer in the YUV data.
+   */
+  uint8_t* uBuffer;
+  /**
+   * The pointer to the V buffer in the YUV data.
+   */
+  uint8_t* vBuffer;
+  /**
+   * The clockwise rotation of this frame. The supported values are 0, 90, 180, and 270.
+   */
+  int rotation;
+  /**
+   * The timestamp to render the video stream. Use this parameter for audio-video synchronization when
+   * rendering the video.
+   *
+   * @note This parameter is for rendering the video, not capturing the video.
+   */
+  int64_t renderTimeMs;
+  /**
+   * The type of audio-video synchronization.
+   */
+  int avsync_type;
+  /**
+   * [Texture related parameter] The MetaData buffer.
+   * The default value is NULL.
+   */
+  uint8_t* metadata_buffer;
+  /**
+   * [Texture related parameter] The MetaData size.
+   * The default value is 0.
+   */
+  int metadata_size;
+  /**
+   * [Texture related parameter] The EGL context.
+   */
+  void* sharedContext;
+  /**
+   * [Texture related parameter] The texture ID used by the video frame.
+   */
+  int textureId;
+  /**
+   * [Texture related parameter] The pointer to the ID3D11Texture2D used by the video frame, for Windows only.
+   */
+  void* d3d11Texture2d;
+  /**
+   * [Texture related parameter] The incoming 4 × 4 transformational matrix.
+   */
+  float matrix[16];
+  /**
+   * The output data of the portrait segmentation algorithm, which is consistent with the size of the video frame.
+   * The value range of each pixel is [0,255]: 0 represents the background, and 255 represents the foreground (portrait).
+   * The default value is NULL.
+   */
+  uint8_t* alphaBuffer;
+  /**
+   * The CVPixelBufferRef, for iOS and macOS only.
+   */
+  void* pixelBuffer;
+  /**
+   * The pointer to IVideoFrameMetaInfo, which is the interface for getting metainfo contents from the VideoFrame.
+   */
+  IVideoFrameMetaInfo* metaInfo;
+};
+
+/**
+ * The IVideoFrameObserver class.
+ */
+class IVideoFrameObserver {
+ public:
+  /**
+   * Occurs each time the player receives a video frame.
+   *
+   * After you register the video frame observer,
+   * the callback occurs each time the player receives a video frame, reporting the detailed information of the video frame.
+   * @param frame The detailed information of the video frame. See {@link VideoFrame}.
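The `yStride`/`uStride`/`vStride` fields size the planes independently of `width`, which matters for I420 where the chroma planes are subsampled by 2 in both dimensions. A self-contained sketch of that plane-size arithmetic (a stand-alone helper for illustration, not an SDK API):

```cpp
#include <cassert>

// Byte sizes of the three planes of an I420 VideoFrame:
// the Y plane is yStride * height bytes, and each chroma plane is
// stride * (height / 2) bytes because chroma is subsampled 2x2.
struct I420Sizes {
  int y;
  int u;
  int v;
};

I420Sizes i420PlaneSizes(int height, int yStride, int uStride, int vStride) {
  I420Sizes s;
  s.y = yStride * height;
  s.u = uStride * (height / 2);
  s.v = vStride * (height / 2);
  return s;
}
```

Note that strides may exceed the visible width when rows are padded for alignment, which is why the size is computed from the stride, not the width.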
+   */
+  virtual void onFrame(const VideoFrame* frame) = 0;
+  virtual ~IVideoFrameObserver() {}
+  virtual bool isExternal() { return true; }
+  virtual VIDEO_PIXEL_FORMAT getVideoFormatPreference() { return VIDEO_PIXEL_DEFAULT; }
+};
+
+enum MEDIA_PLAYER_SOURCE_TYPE {
+  /**
+   * The actual player type used for MEDIA_PLAYER_SOURCE_DEFAULT is decided by the
+   * type of SDK package: the full-featured media player in the full-featured SDK,
+   * and the simple media player in the others.
+   */
+  MEDIA_PLAYER_SOURCE_DEFAULT,
+  /**
+   * The full-featured media player supports more codecs and media formats, and
+   * requires a larger package size than the simple player. To enable this player,
+   * you might need to download a full-featured SDK.
+   */
+  MEDIA_PLAYER_SOURCE_FULL_FEATURED,
+  /**
+   * The simple media player supports a limited set of codecs, has a minimal package
+   * size requirement, and is enabled by default.
+   */
+  MEDIA_PLAYER_SOURCE_SIMPLE,
+};
+
+enum VIDEO_MODULE_POSITION {
+  POSITION_POST_CAPTURER = 1 << 0,
+  POSITION_PRE_RENDERER = 1 << 1,
+  POSITION_PRE_ENCODER = 1 << 2,
+  POSITION_POST_CAPTURER_ORIGIN = 1 << 3,
+};
+
+}  // namespace base
+
+/**
+ * The audio frame observer.
+ */
+class IAudioPcmFrameSink {
+ public:
+  /**
+   * Occurs each time the player receives an audio frame.
+   *
+   * After you register the audio frame observer,
+   * the callback occurs each time the player receives an audio frame,
+   * reporting the detailed information of the audio frame.
+   * @param frame The detailed information of the audio frame. See {@link AudioPcmFrame}.
+   */
+  virtual void onFrame(agora::media::base::AudioPcmFrame* frame) = 0;
+  virtual ~IAudioPcmFrameSink() {}
+};
+
+/**
+ * The IAudioFrameObserverBase class.
+ */
+class IAudioFrameObserverBase {
+ public:
+  /**
+   * Audio frame types.
+   */
+  enum AUDIO_FRAME_TYPE {
+    /**
+     * 0: 16-bit PCM.
+     */
+    FRAME_TYPE_PCM16 = 0,
+  };
+  enum { MAX_HANDLE_TIME_CNT = 10 };
+  /**
+   * The definition of the AudioFrame struct.
+   */
+  struct AudioFrame {
+    /**
+     * The audio frame type: #AUDIO_FRAME_TYPE.
+     */
+    AUDIO_FRAME_TYPE type;
+    /**
+     * The number of samples per channel in this frame.
+     */
+    int samplesPerChannel;
+    /**
+     * The number of bytes per sample: #BYTES_PER_SAMPLE
+     */
+    agora::rtc::BYTES_PER_SAMPLE bytesPerSample;
+    /**
+     * The number of audio channels (the data is interleaved, if stereo).
+     * - 1: Mono.
+     * - 2: Stereo.
+     */
+    int channels;
+    /**
+     * The sample rate (Hz).
+     */
+    int samplesPerSec;
+    /**
+     * The data buffer of the audio frame. When the audio frame uses a stereo channel, the data
+     * buffer is interleaved.
+     *
+     * Buffer data size: buffer = samplesPerChannel × channels × bytesPerSample.
+     */
+    void* buffer;
+    /**
+     * The timestamp to render the audio data.
+     *
+     * You can use this timestamp to restore the order of the captured audio frames, and to synchronize
+     * audio and video frames in video scenarios, including scenarios where external video sources
+     * are used.
+     */
+    int64_t renderTimeMs;
+    /**
+     * A reserved parameter.
+     *
+     * You can use the presentationMs parameter to indicate the presentation millisecond timestamp.
+     * It is then filled into the audio4 extension part, and the remote side can use this PTS for
+     * audio-video synchronization with the video frame.
+     */
+    int avsync_type;
+    /**
+     * The PTS timestamp of this audio frame.
+     *
+     * This timestamp indicates the original PTS time of the frame, which the receiver can use to
+     * synchronize with video frames by PTS.
+     */
+    int64_t presentationMs;
+    /**
+     * The number of the audio track.
+ */ + int audioTrackNumber; + + AudioFrame() : type(FRAME_TYPE_PCM16), + samplesPerChannel(0), + bytesPerSample(rtc::TWO_BYTES_PER_SAMPLE), + channels(0), + samplesPerSec(0), + buffer(NULL), + renderTimeMs(0), + avsync_type(0), + presentationMs(0), + audioTrackNumber(0) {} + }; + + enum AUDIO_FRAME_POSITION { + AUDIO_FRAME_POSITION_NONE = 0x0000, + /** The position for observing the playback audio of all remote users after mixing + */ + AUDIO_FRAME_POSITION_PLAYBACK = 0x0001, + /** The position for observing the recorded audio of the local user + */ + AUDIO_FRAME_POSITION_RECORD = 0x0002, + /** The position for observing the mixed audio of the local user and all remote users + */ + AUDIO_FRAME_POSITION_MIXED = 0x0004, + /** The position for observing the audio of a single remote user before mixing + */ + AUDIO_FRAME_POSITION_BEFORE_MIXING = 0x0008, + /** The position for observing the ear monitoring audio of the local user + */ + AUDIO_FRAME_POSITION_EAR_MONITORING = 0x0010, + }; + + struct AudioParams { + /** The audio sample rate (Hz), which can be set as one of the following values: + + - `8000` + - `16000` (Default) + - `32000` + - `44100 ` + - `48000` + */ + int sample_rate; + + /* The number of audio channels, which can be set as either of the following values: + + - `1`: Mono (Default) + - `2`: Stereo + */ + int channels; + + /* The use mode of the audio data. See AgoraAudioRawFrameOperationMode. + */ + rtc::RAW_AUDIO_FRAME_OP_MODE_TYPE mode; + + /** The number of samples. For example, set it as 1024 for RTMP or RTMPS + streaming. 
+ */ + int samples_per_call; + + AudioParams() : sample_rate(0), channels(0), mode(rtc::RAW_AUDIO_FRAME_OP_MODE_READ_ONLY), samples_per_call(0) {} + AudioParams(int samplerate, int channel, rtc::RAW_AUDIO_FRAME_OP_MODE_TYPE type, int samplesPerCall) : sample_rate(samplerate), channels(channel), mode(type), samples_per_call(samplesPerCall) {} + }; + + public: + virtual ~IAudioFrameObserverBase() {} + + /** + * Occurs when the recorded audio frame is received. + * @param channelId The channel name + * @param audioFrame The reference to the audio frame: AudioFrame. + * @return + * - true: The recorded audio frame is valid and is encoded and sent. + * - false: The recorded audio frame is invalid and is not encoded or sent. + */ + virtual bool onRecordAudioFrame(const char* channelId, AudioFrame& audioFrame) = 0; + /** + * Occurs when the playback audio frame is received. + * @param channelId The channel name + * @param audioFrame The reference to the audio frame: AudioFrame. + * @return + * - true: The playback audio frame is valid and is encoded and sent. + * - false: The playback audio frame is invalid and is not encoded or sent. + */ + virtual bool onPlaybackAudioFrame(const char* channelId, AudioFrame& audioFrame) = 0; + /** + * Occurs when the mixed audio data is received. + * @param channelId The channel name + * @param audioFrame The reference to the audio frame: AudioFrame. + * @return + * - true: The mixed audio data is valid and is encoded and sent. + * - false: The mixed audio data is invalid and is not encoded or sent. + */ + virtual bool onMixedAudioFrame(const char* channelId, AudioFrame& audioFrame) = 0; + /** + * Occurs when the ear monitoring audio frame is received. + * @param audioFrame The reference to the audio frame: AudioFrame. + * @return + * - true: The ear monitoring audio data is valid and is encoded and sent. + * - false: The ear monitoring audio data is invalid and is not encoded or sent. 
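The `AudioFrame` callbacks above hand over an interleaved buffer whose byte size follows the formula in the struct comment (samplesPerChannel × channels × bytesPerSample). A self-contained sketch of that arithmetic (a stand-alone helper for illustration, not an SDK API):

```cpp
#include <cassert>
#include <cstddef>

// Byte size of an interleaved PCM AudioFrame buffer:
// bytes = samplesPerChannel * channels * bytesPerSample.
size_t audioFrameBytes(int samplesPerChannel, int channels, int bytesPerSample) {
  return static_cast<size_t>(samplesPerChannel) * channels * bytesPerSample;
}
```

For example, a 10 ms frame of 48 kHz stereo 16-bit PCM has samplesPerChannel = 480, channels = 2, and bytesPerSample = 2, so the buffer is 1920 bytes.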
+   */
+  virtual bool onEarMonitoringAudioFrame(AudioFrame& audioFrame) = 0;
+  /**
+   * Occurs when the before-mixing playback audio frame is received.
+   * @param channelId The channel name.
+   * @param userId ID of the remote user.
+   * @param audioFrame The reference to the audio frame: AudioFrame.
+   * @return
+   * - true: The before-mixing playback audio frame is valid and is encoded and sent.
+   * - false: The before-mixing playback audio frame is invalid and is not encoded or sent.
+   */
+  virtual bool onPlaybackAudioFrameBeforeMixing(const char* channelId, base::user_id_t userId, AudioFrame& audioFrame) {
+    (void) channelId;
+    (void) userId;
+    (void) audioFrame;
+    return true;
+  }
+
+  /**
+   * Sets the frame positions for the audio observer.
+   *
+   * After you successfully register the audio observer, the SDK triggers this callback each time it receives an audio frame. You determine which positions to observe by setting the return value.
+   * The SDK provides four positions for the observer. Each position corresponds to a callback function:
+   * - `AUDIO_FRAME_POSITION_PLAYBACK (1 << 0)`: The position for observing the playback audio frame, which corresponds to the \ref onPlaybackAudioFrame "onPlaybackAudioFrame" callback.
+   * - `AUDIO_FRAME_POSITION_RECORD (1 << 1)`: The position for observing the recorded audio frame, which corresponds to the \ref onRecordAudioFrame "onRecordAudioFrame" callback.
+   * - `AUDIO_FRAME_POSITION_MIXED (1 << 2)`: The position for observing the mixed audio frame, which corresponds to the \ref onMixedAudioFrame "onMixedAudioFrame" callback.
+   * - `AUDIO_FRAME_POSITION_BEFORE_MIXING (1 << 3)`: The position for observing the playback audio frame before mixing, which corresponds to the \ref onPlaybackAudioFrameBeforeMixing "onPlaybackAudioFrameBeforeMixing" callback.
+   *
+   * @note Use '|' (the OR operator) to observe multiple frame positions.
+   *
+   * @return The bit mask that controls the audio observation positions.
+   * See #AUDIO_FRAME_POSITION.
+   */
+  virtual int getObservedAudioFramePosition() = 0;
+
+  /** Sets the audio playback format.
+   *
+   * **Note**:
+   * - The SDK calculates the sample interval according to the `AudioParams`
+   * you set in the return value of this callback and triggers the
+   * `onPlaybackAudioFrame` callback at the calculated sample interval.
+   * Sample interval (seconds) = `samplesPerCall`/(`sampleRate` × `channels`).
+   * Ensure that the sample interval is equal to or greater than 0.01.
+   *
+   * @return Sets the audio format. See AudioParams.
+   */
+  virtual AudioParams getPlaybackAudioParams() = 0;
+
+  /** Sets the audio recording format.
+   *
+   * **Note**:
+   * - The SDK calculates the sample interval according to the `AudioParams`
+   * you set in the return value of this callback and triggers the
+   * `onRecordAudioFrame` callback at the calculated sample interval.
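The sample-interval rule described in these notes can be checked numerically. A self-contained sketch of the interval formula and the 0.01-second lower bound (stand-alone helpers for illustration, not SDK APIs):

```cpp
#include <cassert>

// Sample interval implied by an AudioParams-style return value:
// interval (s) = samples_per_call / (sample_rate * channels).
double sampleInterval(int sampleRate, int channels, int samplesPerCall) {
  return static_cast<double>(samplesPerCall) /
         (static_cast<double>(sampleRate) * channels);
}

// The note above requires the interval to be at least 0.01 s (10 ms).
bool intervalIsValid(int sampleRate, int channels, int samplesPerCall) {
  return sampleInterval(sampleRate, channels, samplesPerCall) >= 0.01;
}
```

For example, 1024 samples per call at 44100 Hz mono yields roughly a 23 ms interval and is valid, while 480 samples per call at 48000 Hz stereo yields 5 ms and is too small.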
+   * Sample interval (seconds) = `samplesPerCall`/(`sampleRate` × `channels`).
+   * Ensure that the sample interval is equal to or greater than 0.01.
+   *
+   * @return Sets the audio format. See AudioParams.
+   */
+  virtual AudioParams getRecordAudioParams() = 0;
+
+  /** Sets the audio mixing format.
+   *
+   * **Note**:
+   * - The SDK calculates the sample interval according to the `AudioParams`
+   * you set in the return value of this callback and triggers the
+   * `onMixedAudioFrame` callback at the calculated sample interval.
+   * Sample interval (seconds) = `samplesPerCall`/(`sampleRate` × `channels`).
+   * Ensure that the sample interval is equal to or greater than 0.01.
+   *
+   * @return Sets the audio format. See AudioParams.
+   */
+  virtual AudioParams getMixedAudioParams() = 0;
+
+  /** Sets the ear monitoring audio format.
+   *
+   * **Note**:
+   * - The SDK calculates the sample interval according to the `AudioParams`
+   * you set in the return value of this callback and triggers the
+   * `onEarMonitoringAudioFrame` callback at the calculated sample interval.
+   * Sample interval (seconds) = `samplesPerCall`/(`sampleRate` × `channels`).
+   * Ensure that the sample interval is equal to or greater than 0.01.
+   *
+   * @return Sets the audio format. See AudioParams.
+   */
+  virtual AudioParams getEarMonitoringAudioParams() = 0;
+};
+
+/**
+ * The IAudioFrameObserver class.
+ */
+class IAudioFrameObserver : public IAudioFrameObserverBase {
+ public:
+  using IAudioFrameObserverBase::onPlaybackAudioFrameBeforeMixing;
+  /**
+   * Occurs when the before-mixing playback audio frame is received.
+   * @param channelId The channel name.
+   * @param uid ID of the remote user.
+   * @param audioFrame The reference to the audio frame: AudioFrame.
+   * @return
+   * - true: The before-mixing playback audio frame is valid and is encoded and sent.
+   * - false: The before-mixing playback audio frame is invalid and is not encoded or sent.
+   */
+  virtual bool onPlaybackAudioFrameBeforeMixing(const char* channelId, rtc::uid_t uid, AudioFrame& audioFrame) = 0;
+};
+
+struct AudioSpectrumData {
+  /**
+   * The audio spectrum data.
+   */
+  const float* audioSpectrumData;
+  /**
+   * The length of the audio spectrum data.
+   */
+  int dataLength;
+
+  AudioSpectrumData() : audioSpectrumData(NULL), dataLength(0) {}
+  AudioSpectrumData(const float* data, int length) :
+      audioSpectrumData(data), dataLength(length) {}
+};
+
+struct UserAudioSpectrumInfo {
+  /**
+   * User ID of the speaker.
+   */
+  agora::rtc::uid_t uid;
+  /**
+   * The audio spectrum data of the speaker.
+   */
+  struct AudioSpectrumData spectrumData;
+
+  UserAudioSpectrumInfo() : uid(0) {}
+
+  UserAudioSpectrumInfo(agora::rtc::uid_t uid, const float* data, int length) : uid(uid), spectrumData(data, length) {}
+};
+
+/**
+ * The IAudioSpectrumObserver class.
+ */
+class IAudioSpectrumObserver {
+ public:
+  virtual ~IAudioSpectrumObserver() {}
+
+  /**
+   * Reports the audio spectrum of the local audio.
+   *
+   * This callback reports the audio spectrum data of the local audio at the moment
+   * in the channel.
+   *
+   * You can set the time interval of this callback using \ref ILocalUser::enableAudioSpectrumMonitor "enableAudioSpectrumMonitor".
+   *
+   * @param data The audio spectrum data of the local audio.
+   * @return
+   * - true: Processed.
+   * - false: Not processed.
+   */
+  virtual bool onLocalAudioSpectrum(const AudioSpectrumData& data) = 0;
+  /**
+   * Reports the audio spectrum of remote users.
+   *
+   * This callback reports the IDs and audio spectrum data of the loudest speakers at the moment
+   * in the channel.
+   *
+   * You can set the time interval of this callback using \ref ILocalUser::enableAudioSpectrumMonitor "enableAudioSpectrumMonitor".
+   *
+   * @param spectrums The pointer to \ref agora::media::UserAudioSpectrumInfo "UserAudioSpectrumInfo", an array containing
+   * the user ID and audio spectrum data of each speaker.
+ * - This array contains the following members: + * - `uid`, which is the UID of each remote speaker + * - `spectrumData`, which reports the audio spectrum of each remote speaker. + * @param spectrumNumber The array length of the spectrums. + * - true: Processed. + * - false: Not processed. + */ + virtual bool onRemoteAudioSpectrum(const UserAudioSpectrumInfo* spectrums, unsigned int spectrumNumber) = 0; +}; + +/** + * The IVideoEncodedFrameObserver class. + */ +class IVideoEncodedFrameObserver { + public: + /** + * Occurs each time the SDK receives an encoded video image. + * @param uid The user id of remote user. + * @param imageBuffer The pointer to the video image buffer. + * @param length The data length of the video image. + * @param videoEncodedFrameInfo The information of the encoded video frame: EncodedVideoFrameInfo. + * @return Determines whether to accept encoded video image. + * - true: Accept. + * - false: Do not accept. + */ + virtual bool onEncodedVideoFrameReceived(rtc::uid_t uid, const uint8_t* imageBuffer, size_t length, + const rtc::EncodedVideoFrameInfo& videoEncodedFrameInfo) = 0; + + virtual ~IVideoEncodedFrameObserver() {} +}; + +/** + * The IVideoFrameObserver class. + */ +class IVideoFrameObserver { + public: + typedef media::base::VideoFrame VideoFrame; + /** + * The process mode of the video frame: + */ + enum VIDEO_FRAME_PROCESS_MODE { + /** + * Read-only mode. + * + * In this mode, you do not modify the video frame. The video frame observer is a renderer. + */ + PROCESS_MODE_READ_ONLY, // Observer works as a pure renderer and will not modify the original frame. + /** + * Read and write mode. + * + * In this mode, you modify the video frame. The video frame observer is a video filter. + */ + PROCESS_MODE_READ_WRITE, // Observer works as a filter that will process the video frame and affect the following frame processing in SDK. 
+  };
+
+ public:
+  virtual ~IVideoFrameObserver() {}
+
+  /**
+   * Occurs each time the SDK receives a video frame captured by the local camera.
+   *
+   * After you successfully register the video frame observer, the SDK triggers this callback each time
+   * a video frame is received. In this callback, you can get the video data captured by the local
+   * camera. You can then pre-process the data according to your scenarios.
+   *
+   * After pre-processing, you can send the processed video data back to the SDK by setting the
+   * `videoFrame` parameter in this callback.
+   *
+   * @note
+   * - If you get the video data in the RGBA color encoding format, Agora does not support using this callback to send the processed data in the RGBA color encoding format back to the SDK.
+   * - The video data that this callback gets has not been pre-processed, for example, by watermarking, cropping, rotating, or image enhancement.
+   *
+   * @param videoFrame A pointer to the video frame: VideoFrame
+   * @param sourceType The source type of the video frame. See #VIDEO_SOURCE_TYPE.
+   * @return Determines whether to ignore the current video frame if the pre-processing fails:
+   * - true: Do not ignore.
+   * - false: Ignore, in which case this method does not send the current video frame to the SDK.
+   */
+  virtual bool onCaptureVideoFrame(agora::rtc::VIDEO_SOURCE_TYPE sourceType, VideoFrame& videoFrame) = 0;
+
+  /**
+   * Occurs each time the SDK receives a video frame before encoding.
+   *
+   * After you successfully register the video frame observer, the SDK triggers this callback each time
+   * it receives a video frame. In this callback, you can get the video data before encoding. You can then
+   * process the data according to your particular scenarios.
+   *
+   * After processing, you can send the processed video data back to the SDK by setting the
+   * `videoFrame` parameter in this callback.
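The `VIDEO_MODULE_POSITION` bits defined earlier combine with bitwise OR into the mask an observer returns, and are tested with bitwise AND. A self-contained sketch mirroring those enum values (illustrative helpers, not SDK APIs):

```cpp
#include <cassert>

// Mirrors VIDEO_MODULE_POSITION from the header above (illustration only).
enum VIDEO_MODULE_POSITION {
  POSITION_POST_CAPTURER = 1 << 0,
  POSITION_PRE_RENDERER = 1 << 1,
  POSITION_PRE_ENCODER = 1 << 2,
  POSITION_POST_CAPTURER_ORIGIN = 1 << 3,
};

// A mask observing both post-capture and pre-encode frames,
// in the style of a getObservedFramePosition return value.
int observedPositions() {
  return POSITION_POST_CAPTURER | POSITION_PRE_ENCODER;
}

// Check whether a given position is observed by a mask.
bool observes(int mask, VIDEO_MODULE_POSITION pos) {
  return (mask & pos) != 0;
}
```

This is why, for example, observing the second-screen frames before encoding requires setting the `(1 << 2)` (pre-encoder) bit in the returned mask.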
+   *
+   * @note
+   * - To get the video data captured from the second screen before encoding, you need to set (1 << 2) as a frame position through `getObservedFramePosition`.
+   * - The video data that this callback gets has been pre-processed by operations such as watermarking, cropping, rotating, or image enhancement.
+   * - This callback does not support sending processed RGBA video data back to the SDK.
+   *
+   * @param videoFrame A pointer to the video frame: VideoFrame
+   * @param sourceType The source type of the video frame. See #VIDEO_SOURCE_TYPE.
+   * @return Determines whether to ignore the current video frame if the pre-processing fails:
+   * - true: Do not ignore.
+   * - false: Ignore, in which case this method does not send the current video frame to the SDK.
+   */
+  virtual bool onPreEncodeVideoFrame(agora::rtc::VIDEO_SOURCE_TYPE sourceType, VideoFrame& videoFrame) = 0;
+
+  /**
+   * Occurs each time the SDK receives a video frame decoded by the MediaPlayer.
+   *
+   * After you successfully register the video frame observer, the SDK triggers this callback each
+   * time a video frame is decoded. In this callback, you can get the video data decoded by the
+   * MediaPlayer. You can then pre-process the data according to your scenarios.
+   *
+   * After pre-processing, you can send the processed video data back to the SDK by setting the
+   * `videoFrame` parameter in this callback.
+   *
+   * @note
+   * - This callback is not affected by the return values of \ref getVideoFrameProcessMode "getVideoFrameProcessMode", \ref getRotationApplied "getRotationApplied", \ref getMirrorApplied "getMirrorApplied", or \ref getObservedFramePosition "getObservedFramePosition".
+   * - On Android, this callback is not affected by the return value of \ref getVideoFormatPreference "getVideoFormatPreference".
+   *
+   * @param videoFrame A pointer to the video frame: VideoFrame
+   * @param mediaPlayerId The ID of the media player.
+   * @return Determines whether to ignore the current video frame if the pre-processing fails:
+   * - true: Do not ignore.
+   * - false: Ignore, in which case this method does not send the current video frame to the SDK.
+   */
+  virtual bool onMediaPlayerVideoFrame(VideoFrame& videoFrame, int mediaPlayerId) = 0;
+
+  /**
+   * Occurs each time the SDK receives a video frame sent by the remote user.
+   *
+   * After you successfully register the video frame observer, the SDK triggers this callback each time a
+   * video frame is received. In this callback, you can get the video data sent by the remote user. You
+   * can then post-process the data according to your scenarios.
+   *
+   * After post-processing, you can send the processed data back to the SDK by setting the `videoFrame`
+   * parameter in this callback.
+   *
+   * @note This callback does not support sending processed RGBA video data back to the SDK.
+   *
+   * @param channelId The channel name.
+   * @param remoteUid The ID of the remote user who sends the current video frame.
+   * @param videoFrame A pointer to the video frame: VideoFrame
+   * @return Determines whether to ignore the current video frame if the post-processing fails:
+   * - true: Do not ignore.
+   * - false: Ignore, in which case this method does not send the current video frame to the SDK.
+   */
+  virtual bool onRenderVideoFrame(const char* channelId, rtc::uid_t remoteUid, VideoFrame& videoFrame) = 0;
+
+  /**
+   * Occurs each time the SDK receives a transcoded video frame.
+   *
+   * @param videoFrame A pointer to the video frame: VideoFrame
+   * @return Determines whether to ignore the current video frame:
+   * - true: Do not ignore.
+   * - false: Ignore.
+   */
+  virtual bool onTranscodedVideoFrame(VideoFrame& videoFrame) = 0;
+
+  /**
+   * Occurs each time the SDK receives a video frame and prompts you to set the process mode of the video frame.
+   *
+   * After you successfully register the video frame observer, the SDK triggers this callback each time it receives
+   * a video frame. You need to set your preferred process mode in the return value of this callback.
+   * @return VIDEO_FRAME_PROCESS_MODE.
+   */
+  virtual VIDEO_FRAME_PROCESS_MODE getVideoFrameProcessMode() {
+    return PROCESS_MODE_READ_ONLY;
+  }
+
+  /**
+   * Sets the format of the raw video data output by the SDK.
+   *
+   * If you want to get raw video data in a color encoding format other than YUV 420, register this callback when
+   * calling `registerVideoFrameObserver`. After you successfully register the video frame observer, the SDK triggers
+   * this callback each time it receives a video frame. You need to set your preferred video data format in the return
+   * value of this callback.
+   *
+   * @note If you want the video captured by the sender to keep its original format, set the video data format
+   * to VIDEO_PIXEL_DEFAULT in the return value. The original video pixel format differs by platform; for the
+   * actual video pixel format, see `VideoFrame`.
+   *
+   * @return Sets the video format. See VIDEO_PIXEL_FORMAT.
+   */
+  virtual base::VIDEO_PIXEL_FORMAT getVideoFormatPreference() { return base::VIDEO_PIXEL_DEFAULT; }
+
+  /**
+   * Occurs each time the SDK receives a video frame, and prompts you whether to rotate the captured video.
+   *
+   * If you want to rotate the captured video according to the rotation member in the `VideoFrame` class, register this
+   * callback by calling `registerVideoFrameObserver`. After you successfully register the video frame observer, the
+   * SDK triggers this callback each time it receives a video frame. You need to set whether to rotate the video frame
+   * in the return value of this callback.
+   *
+   * @note This function only supports video data in the RGBA and YUV420 formats.
+   *
+   * @return Determines whether to rotate.
+   * - `true`: Rotate the captured video.
+   * - `false`: (Default) Do not rotate the captured video.
+   */
+  virtual bool getRotationApplied() { return false; }
+
+  /**
+   * Occurs each time the SDK receives a video frame and prompts you whether or not to mirror the captured video.
+   *
+   * If the video data you want to obtain is a mirror image of the original video, you need to register this callback
+   * when calling `registerVideoFrameObserver`. After you successfully register the video frame observer, the SDK
+   * triggers this callback each time it receives a video frame. You need to set whether or not to mirror the video
+   * frame in the return value of this callback.
+   *
+   * @note This function only supports video data in the RGBA and YUV420 formats.
+   *
+   * @return Determines whether to mirror.
+   * - `true`: Mirror the captured video.
+   * - `false`: (Default) Do not mirror the captured video.
+   */
+  virtual bool getMirrorApplied() { return false; }
+
+  /**
+   * Sets the frame position for the video observer.
+   *
+   * After you successfully register the video observer, the SDK triggers this callback each time it receives
+   * a video frame. You can determine which position to observe by setting the return value. The SDK provides
+   * three positions for the observer. Each position corresponds to a callback function:
+   *
+   * - POSITION_POST_CAPTURER(1 << 0): The position after capturing the video data, which corresponds to the onCaptureVideoFrame callback.
+   * - POSITION_PRE_RENDERER(1 << 1): The position before receiving the remote video data, which corresponds to the onRenderVideoFrame callback.
+   * - POSITION_PRE_ENCODER(1 << 2): The position before encoding the video data, which corresponds to the onPreEncodeVideoFrame callback.
+   *
+   * To observe multiple frame positions, use '|' (the OR operator).
+   * This callback observes POSITION_POST_CAPTURER(1 << 0) and POSITION_PRE_RENDERER(1 << 1) by default.
+   * To reduce system resource consumption, you can decrease the number of frame positions that you observe.
+   *
+   * @return A bit mask that controls the frame position of the video observer: VIDEO_OBSERVER_POSITION.
+ */ + virtual uint32_t getObservedFramePosition() { + return base::POSITION_POST_CAPTURER | base::POSITION_PRE_RENDERER; + } + + /** + * Indicate if the observer is for internal use. + * Note: Never override this function + * @return + * - true: the observer is for external use + * - false: the observer is for internal use + */ + virtual bool isExternal() { return true; } +}; + +/** + * The external video source type. + */ +enum EXTERNAL_VIDEO_SOURCE_TYPE { + /** + * 0: non-encoded video frame. + */ + VIDEO_FRAME = 0, + /** + * 1: encoded video frame. + */ + ENCODED_VIDEO_FRAME, +}; + +/** + * The format of the recording file. + * + * @since v3.5.2 + */ +enum MediaRecorderContainerFormat { + /** + * 1: (Default) MP4. + */ + FORMAT_MP4 = 1, +}; +/** + * The recording content. + * + * @since v3.5.2 + */ +enum MediaRecorderStreamType { + /** + * Only audio. + */ + STREAM_TYPE_AUDIO = 0x01, + /** + * Only video. + */ + STREAM_TYPE_VIDEO = 0x02, + /** + * (Default) Audio and video. + */ + STREAM_TYPE_BOTH = STREAM_TYPE_AUDIO | STREAM_TYPE_VIDEO, +}; +/** + * The current recording state. + * + * @since v3.5.2 + */ +enum RecorderState { + /** + * -1: An error occurs during the recording. See RecorderReasonCode for the reason. + */ + RECORDER_STATE_ERROR = -1, + /** + * 2: The audio and video recording is started. + */ + RECORDER_STATE_START = 2, + /** + * 3: The audio and video recording is stopped. + */ + RECORDER_STATE_STOP = 3, +}; +/** + * The reason for the state change + * + * @since v3.5.2 + */ +enum RecorderReasonCode { + /** + * 0: No error occurs. + */ + RECORDER_REASON_NONE = 0, + /** + * 1: The SDK fails to write the recorded data to a file. + */ + RECORDER_REASON_WRITE_FAILED = 1, + /** + * 2: The SDK does not detect audio and video streams to be recorded, or audio and video streams are interrupted for more than five seconds during recording. + */ + RECORDER_REASON_NO_STREAM = 2, + /** + * 3: The recording duration exceeds the upper limit. 
+ */ + RECORDER_REASON_OVER_MAX_DURATION = 3, + /** + * 4: The recording configuration changes. + */ + RECORDER_REASON_CONFIG_CHANGED = 4, +}; +/** + * Configurations for the local audio and video recording. + * + * @since v3.5.2 + */ +struct MediaRecorderConfiguration { + /** + * The absolute path (including the filename extensions) of the recording file. + * For example, `C:\Users\\AppData\Local\Agora\\example.mp4` on Windows, + * `/App Sandbox/Library/Caches/example.mp4` on iOS, `/Library/Logs/example.mp4` on macOS, and + * `/storage/emulated/0/Android/data//files/example.mp4` on Android. + * + * @note Ensure that the specified path exists and is writable. + */ + const char* storagePath; + /** + * The format of the recording file. See \ref agora::rtc::MediaRecorderContainerFormat "MediaRecorderContainerFormat". + */ + MediaRecorderContainerFormat containerFormat; + /** + * The recording content. See \ref agora::rtc::MediaRecorderStreamType "MediaRecorderStreamType". + */ + MediaRecorderStreamType streamType; + /** + * The maximum recording duration, in milliseconds. The default value is 120000. + */ + int maxDurationMs; + /** + * The interval (ms) of updating the recording information. The value range is + * [1000,10000]. Based on the set value of `recorderInfoUpdateInterval`, the + * SDK triggers the \ref IMediaRecorderObserver::onRecorderInfoUpdated "onRecorderInfoUpdated" + * callback to report the updated recording information. + */ + int recorderInfoUpdateInterval; + + MediaRecorderConfiguration() : storagePath(NULL), containerFormat(FORMAT_MP4), streamType(STREAM_TYPE_BOTH), maxDurationMs(120000), recorderInfoUpdateInterval(0) {} + MediaRecorderConfiguration(const char* path, MediaRecorderContainerFormat format, MediaRecorderStreamType type, int duration, int interval) : storagePath(path), containerFormat(format), streamType(type), maxDurationMs(duration), recorderInfoUpdateInterval(interval) {} +}; +/** + * Information for the recording file. 
+ * + * @since v3.5.2 + */ +struct RecorderInfo { + /** + * The absolute path of the recording file. + */ + const char* fileName; + /** + * The recording duration, in milliseconds. + */ + unsigned int durationMs; + /** + * The size in bytes of the recording file. + */ + unsigned int fileSize; + + RecorderInfo() : fileName(NULL), durationMs(0), fileSize(0) {} + RecorderInfo(const char* name, unsigned int dur, unsigned int size) : fileName(name), durationMs(dur), fileSize(size) {} +}; + +class IMediaRecorderObserver { + public: + /** + * Occurs when the recording state changes. + * + * @since v4.0.0 + * + * When the local audio and video recording state changes, the SDK triggers this callback to report the current + * recording state and the reason for the change. + * + * @param channelId The channel name. + * @param uid ID of the user. + * @param state The current recording state. See \ref agora::media::RecorderState "RecorderState". + * @param reason The reason for the state change. See \ref agora::media::RecorderReasonCode "RecorderReasonCode". + */ + virtual void onRecorderStateChanged(const char* channelId, rtc::uid_t uid, RecorderState state, RecorderReasonCode reason) = 0; + /** + * Occurs when the recording information is updated. + * + * @since v4.0.0 + * + * After you successfully register this callback and enable the local audio and video recording, the SDK periodically triggers + * the `onRecorderInfoUpdated` callback based on the set value of `recorderInfoUpdateInterval`. This callback reports the + * filename, duration, and size of the current recording file. + * + * @param channelId The channel name. + * @param uid ID of the user. + * @param info Information about the recording file. See \ref agora::media::RecorderInfo "RecorderInfo". 
+ * + */ + virtual void onRecorderInfoUpdated(const char* channelId, rtc::uid_t uid, const RecorderInfo& info) = 0; + + virtual ~IMediaRecorderObserver() {} +}; + +} // namespace media +} // namespace agora diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraMediaPlayerTypes.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraMediaPlayerTypes.h new file mode 100644 index 000000000..3beaba788 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraMediaPlayerTypes.h @@ -0,0 +1,516 @@ +// +// Agora Engine SDK +// +// Created by Sting Feng in 2020-05. +// Copyright (c) 2017 Agora.io. All rights reserved. + +#pragma once // NOLINT(build/header_guard) + +#include +#include + +#include "AgoraOptional.h" + +/** + * set analyze duration for real time stream + * @example "setPlayerOption(KEY_PLAYER_REAL_TIME_STREAM_ANALYZE_DURATION,1000000)" + */ +#define KEY_PLAYER_REAL_TIME_STREAM_ANALYZE_DURATION "analyze_duration" + +/** + * make the player to enable audio or not + * @example "setPlayerOption(KEY_PLAYER_ENABLE_AUDIO,0)" + */ +#define KEY_PLAYER_ENABLE_AUDIO "enable_audio" + +/** + * make the player to enable video or not + * @example "setPlayerOption(KEY_PLAYER_ENABLE_VIDEO,0)" + */ +#define KEY_PLAYER_ENABLE_VIDEO "enable_video" + +/** + * set the player enable to search metadata + * @example "setPlayerOption(KEY_PLAYER_DISABLE_SEARCH_METADATA,0)" + */ +#define KEY_PLAYER_ENABLE_SEARCH_METADATA "enable_search_metadata" + +/** + * set the player sei filter type + * @example "setPlayerOption(KEY_PLAYER_SEI_FILTER_TYPE,"5")" + */ +#define KEY_PLAYER_SEI_FILTER_TYPE "set_sei_filter_type" + +namespace agora { + +namespace media { + +namespace base { +static const uint8_t kMaxCharBufferLength = 50; +/** + * @brief The playback state. + * + */ +enum MEDIA_PLAYER_STATE { + /** Default state. + */ + PLAYER_STATE_IDLE = 0, + /** Opening the media file. 
+   */
+  PLAYER_STATE_OPENING,
+  /** The media file is opened successfully.
+   */
+  PLAYER_STATE_OPEN_COMPLETED,
+  /** Playing the media file.
+   */
+  PLAYER_STATE_PLAYING,
+  /** The playback is paused.
+   */
+  PLAYER_STATE_PAUSED,
+  /** The playback is completed.
+   */
+  PLAYER_STATE_PLAYBACK_COMPLETED,
+  /** All loops are completed.
+   */
+  PLAYER_STATE_PLAYBACK_ALL_LOOPS_COMPLETED,
+  /** The playback is stopped.
+   */
+  PLAYER_STATE_STOPPED,
+  /** Player pausing (internal)
+   */
+  PLAYER_STATE_PAUSING_INTERNAL = 50,
+  /** Player stopping (internal)
+   */
+  PLAYER_STATE_STOPPING_INTERNAL,
+  /** Player seeking state (internal)
+   */
+  PLAYER_STATE_SEEKING_INTERNAL,
+  /** Player getting state (internal)
+   */
+  PLAYER_STATE_GETTING_INTERNAL,
+  /** None state for state machine (internal)
+   */
+  PLAYER_STATE_NONE_INTERNAL,
+  /** Do-nothing state for state machine (internal)
+   */
+  PLAYER_STATE_DO_NOTHING_INTERNAL,
+  /** Player set track state (internal)
+   */
+  PLAYER_STATE_SET_TRACK_INTERNAL,
+  /** The playback fails.
+   */
+  PLAYER_STATE_FAILED = 100,
+};
+/**
+ * @brief Player error code
+ *
+ */
+enum MEDIA_PLAYER_REASON {
+  /** No error.
+   */
+  PLAYER_REASON_NONE = 0,
+  /** The parameter is invalid.
+   */
+  PLAYER_REASON_INVALID_ARGUMENTS = -1,
+  /** Internal error.
+   */
+  PLAYER_REASON_INTERNAL = -2,
+  /** No resource.
+   */
+  PLAYER_REASON_NO_RESOURCE = -3,
+  /** Invalid media source.
+   */
+  PLAYER_REASON_INVALID_MEDIA_SOURCE = -4,
+  /** The type of the media stream is unknown.
+   */
+  PLAYER_REASON_UNKNOWN_STREAM_TYPE = -5,
+  /** The object is not initialized.
+   */
+  PLAYER_REASON_OBJ_NOT_INITIALIZED = -6,
+  /** The codec is not supported.
+   */
+  PLAYER_REASON_CODEC_NOT_SUPPORTED = -7,
+  /** Invalid renderer.
+   */
+  PLAYER_REASON_VIDEO_RENDER_FAILED = -8,
+  /** An error occurs in the internal state of the player.
+   */
+  PLAYER_REASON_INVALID_STATE = -9,
+  /** The URL of the media file cannot be found.
+   */
+  PLAYER_REASON_URL_NOT_FOUND = -10,
+  /** Invalid connection between the player and the Agora server.
+   */
+  PLAYER_REASON_INVALID_CONNECTION_STATE = -11,
+  /** The playback buffer is insufficient.
+   */
+  PLAYER_REASON_SRC_BUFFER_UNDERFLOW = -12,
+  /** The audio mixing file playback is interrupted.
+   */
+  PLAYER_REASON_INTERRUPTED = -13,
+  /** The SDK does not support this function.
+   */
+  PLAYER_REASON_NOT_SUPPORTED = -14,
+  /** The token has expired.
+   */
+  PLAYER_REASON_TOKEN_EXPIRED = -15,
+  /** The IP address has expired.
+   */
+  PLAYER_REASON_IP_EXPIRED = -16,
+  /** An unknown error occurs.
+   */
+  PLAYER_REASON_UNKNOWN = -17,
+};
+
+/**
+ * @brief The type of the media stream.
+ *
+ */
+enum MEDIA_STREAM_TYPE {
+  /** The type is unknown.
+   */
+  STREAM_TYPE_UNKNOWN = 0,
+  /** The video stream.
+   */
+  STREAM_TYPE_VIDEO = 1,
+  /** The audio stream.
+   */
+  STREAM_TYPE_AUDIO = 2,
+  /** The subtitle stream.
+   */
+  STREAM_TYPE_SUBTITLE = 3,
+};
+
+/**
+ * @brief The playback event.
+ *
+ */
+enum MEDIA_PLAYER_EVENT {
+  /** The player begins to seek to the new playback position.
+   */
+  PLAYER_EVENT_SEEK_BEGIN = 0,
+  /** The seek operation completes.
+   */
+  PLAYER_EVENT_SEEK_COMPLETE = 1,
+  /** An error occurs during the seek operation.
+   */
+  PLAYER_EVENT_SEEK_ERROR = 2,
+  /** The player changes the audio track for playback.
+   */
+  PLAYER_EVENT_AUDIO_TRACK_CHANGED = 5,
+  /** The player buffer is low.
+   */
+  PLAYER_EVENT_BUFFER_LOW = 6,
+  /** The player buffer has recovered.
+   */
+  PLAYER_EVENT_BUFFER_RECOVER = 7,
+  /** The video or audio playback is interrupted.
+   */
+  PLAYER_EVENT_FREEZE_START = 8,
+  /** The video or audio playback resumes after an interruption.
+   */
+  PLAYER_EVENT_FREEZE_STOP = 9,
+  /** Switching the media source begins.
+   */
+  PLAYER_EVENT_SWITCH_BEGIN = 10,
+  /** Switching the media source completes.
+   */
+  PLAYER_EVENT_SWITCH_COMPLETE = 11,
+  /** Switching the media source fails.
+   */
+  PLAYER_EVENT_SWITCH_ERROR = 12,
+  /** The first video frame is rendered.
+   */
+  PLAYER_EVENT_FIRST_DISPLAYED = 13,
+  /** The cached resources exceed the maximum file count.
+   */
+  PLAYER_EVENT_REACH_CACHE_FILE_MAX_COUNT = 14,
+  /** The cached resources exceed the maximum file size.
+   */
+  PLAYER_EVENT_REACH_CACHE_FILE_MAX_SIZE = 15,
+  /** Triggered when a retry is required to open the media.
+   */
+  PLAYER_EVENT_TRY_OPEN_START = 16,
+  /** Triggered when the retry to open the media succeeds.
+   */
+  PLAYER_EVENT_TRY_OPEN_SUCCEED = 17,
+  /** Triggered when the retry to open the media fails.
+   */
+  PLAYER_EVENT_TRY_OPEN_FAILED = 18,
+};
+
+/**
+ * @brief Events for preloading another media source.
+ *
+ */
+enum PLAYER_PRELOAD_EVENT {
+  /** Preloading the media source begins.
+   */
+  PLAYER_PRELOAD_EVENT_BEGIN = 0,
+  /** Preloading the media source completes.
+   */
+  PLAYER_PRELOAD_EVENT_COMPLETE = 1,
+  /** Preloading the media source fails.
+   */
+  PLAYER_PRELOAD_EVENT_ERROR = 2,
+};
+
+/**
+ * @brief The information of the media stream object.
+ *
+ */
+struct PlayerStreamInfo {
+  /** The index of the media stream. */
+  int streamIndex;
+
+  /** The type of the media stream. See {@link MEDIA_STREAM_TYPE}. */
+  MEDIA_STREAM_TYPE streamType;
+
+  /** The codec of the media stream. */
+  char codecName[kMaxCharBufferLength];
+
+  /** The language of the media stream. */
+  char language[kMaxCharBufferLength];
+
+  /** The frame rate (fps) if the stream is video.
+   */
+  int videoFrameRate;
+
+  /** The video bitrate (bps) if the stream is video. */
+  int videoBitRate;
+
+  /** The video width (pixel) if the stream is video. */
+  int videoWidth;
+
+  /** The video height (pixel) if the stream is video. */
+  int videoHeight;
+
+  /** The rotation angle if the stream is video. */
+  int videoRotation;
+
+  /** The sample rate if the stream is audio. */
+  int audioSampleRate;
+
+  /** The number of audio channels if the stream is audio. */
+  int audioChannels;
+
+  /** The number of bits per sample if the stream is audio. */
+  int audioBitsPerSample;
+
+  /** The total duration (millisecond) of the media stream. */
+  int64_t duration;
+
+  PlayerStreamInfo() : streamIndex(0),
+                       streamType(STREAM_TYPE_UNKNOWN),
+                       videoFrameRate(0),
+                       videoBitRate(0),
+                       videoWidth(0),
+                       videoHeight(0),
+                       videoRotation(0),
+                       audioSampleRate(0),
+                       audioChannels(0),
+                       audioBitsPerSample(0),
+                       duration(0) {
+    memset(codecName, 0, sizeof(codecName));
+    memset(language, 0, sizeof(language));
+  }
+};
+
+/**
+ * @brief The information of the media stream object.
+ *
+ */
+struct SrcInfo {
+  /** The bitrate of the media stream, in kbps.
+   *
+   */
+  int bitrateInKbps;
+
+  /** The name of the media stream.
+   *
+   */
+  const char* name;
+
+};
+
+/**
+ * @brief The type of the media metadata.
+ *
+ */
+enum MEDIA_PLAYER_METADATA_TYPE {
+  /** The type is unknown.
+   */
+  PLAYER_METADATA_TYPE_UNKNOWN = 0,
+  /** The type is SEI.
+   */
+  PLAYER_METADATA_TYPE_SEI = 1,
+};
+
+struct CacheStatistics {
+  /** The total data size of the URI.
+   */
+  int64_t fileSize;
+  /** The size of the data that has been cached.
+   */
+  int64_t cacheSize;
+  /** The size of the data that has been downloaded.
+   */
+  int64_t downloadSize;
+};
+
+/**
+ * @brief The real-time statistics of the media stream being played.
+ *
+ */
+struct PlayerPlaybackStats {
+  /** Video fps.
+   */
+  int videoFps;
+  /** Video bitrate (Kbps).
+   */
+  int videoBitrateInKbps;
+  /** Audio bitrate (Kbps).
+   */
+  int audioBitrateInKbps;
+  /** Total bitrate (Kbps).
+   */
+  int totalBitrateInKbps;
+};
+
+/**
+ * @brief The updated information of the media player.
+ *
+ */
+struct PlayerUpdatedInfo {
+  /** @technical preview
+   */
+  const char* internalPlayerUuid;
+  /** The device ID of the playback device.
+   */
+  const char* deviceId;
+  /** Video height.
+   */
+  int videoHeight;
+  /** Video width.
+   */
+  int videoWidth;
+  /** Audio sample rate.
+   */
+  int audioSampleRate;
+  /** The audio channel number.
+   */
+  int audioChannels;
+  /** The bit number of each audio sample.
+   */
+  int audioBitsPerSample;
+
+  PlayerUpdatedInfo()
+    : internalPlayerUuid(NULL),
+      deviceId(NULL),
+      videoHeight(0),
+      videoWidth(0),
+      audioSampleRate(0),
+      audioChannels(0),
+      audioBitsPerSample(0) {}
+};
+
+/**
+ * The custom data source provides a data stream input callback. The player keeps calling back this interface,
+ * requesting you to fill in the data that needs to be played.
+ */
+class IMediaPlayerCustomDataProvider {
+public:
+
+  /**
+   * @brief Occurs when the player requests to read data. You need to fill the buffer with the specified length of data.
+   * @param buffer The buffer pointer to fill with data.
+   * @param bufferSize The size, in bytes, of the buffer to fill.
+   * @return The number of bytes actually filled on success; 0 on failure.
+   */
+  virtual int onReadData(unsigned char *buffer, int bufferSize) = 0;
+
+  /**
+   * @brief Occurs when the player requests a seek. You need to perform the corresponding seek operation on your stream. You can refer to the definition of lseek() at https://man7.org/linux/man-pages/man2/lseek.2.html
+   * @param offset The seek offset.
+   * @param whence The position from which to start seeking. The whence directive is one of the following:
+   * 0 - SEEK_SET : The file offset is set to offset bytes.
+   * 1 - SEEK_CUR : The file offset is set to its current location plus offset bytes.
+   * 2 - SEEK_END : The file offset is set to the size of the file plus offset bytes.
+   * 65536 - AVSEEK_SIZE : Optional. Passing this as the "whence" parameter causes the function to return the file size without seeking anywhere.
+   * @return
+   * If whence == 65536, return the file size.
+   * If whence >= 0 && whence < 3, return the resulting offset on success; return -1 on failure.
+   */
+  virtual int64_t onSeek(int64_t offset, int whence) = 0;
+
+  virtual ~IMediaPlayerCustomDataProvider() {}
+};
+
+struct MediaSource {
+  /**
+   * The URL of the media file that you want to play.
+   */
+  const char* url;
+  /**
+   * The URI of the media file.
+   *
+   * When caching is enabled, if the URL cannot uniquely identify the cache file,
+   * the URI must ensure that the cache file name corresponding to the URL is unique.
+   */
+  const char* uri;
+  /**
+   * The starting position for playback, in ms.
+   */
+  int64_t startPos;
+  /**
+   * Determines whether to autoplay after opening a media resource.
+   * - true: (Default) Autoplay after opening a media resource.
+   * - false: Do not autoplay after opening a media resource.
+   */
+  bool autoPlay;
+  /**
+   * Determines whether to enable caching streams to local files. If caching is enabled, the media player
+   * uses the url or uri as the cache index.
+   *
+   * @note
+   * The local cache function only supports on-demand video/audio streams and does not support live streams.
+   * Caching video and audio files based on the HLS protocol (m3u8) to your local device is not supported.
+   *
+   * - true: Enable cache.
+   * - false: (Default) Disable cache.
+   */
+  bool enableCache;
+  /**
+   * Determines whether to enable multi-track audio stream decoding.
+   * You can then select an audio track of the media file for playback or publish it to the channel.
+   *
+   * @note
+   * If you use the selectMultiAudioTrack API, you must set enableMultiAudioTrack to true.
+   *
+   * - true: Enable MultiAudioTrack.
+   * - false: (Default) Disable MultiAudioTrack.
+   */
+  bool enableMultiAudioTrack;
+  /**
+   * Determines whether the opened media resource is a stream through the Agora Broadcast Streaming Network (CDN).
+   * - true: It is a stream through the Agora Broadcast Streaming Network.
+   * - false: (Default) It is not a stream through the Agora Broadcast Streaming Network.
+   */
+  Optional<bool> isAgoraSource;
+  /**
+   * Determines whether the opened media resource is a live stream. If it is a live stream, this can speed up the opening of the media resource.
+   * - true: It is a live stream.
+   * - false: (Default) It is not a live stream.
+   */
+  Optional<bool> isLiveSource;
+  /**
+   * The external custom data source object.
+   */
+  IMediaPlayerCustomDataProvider* provider;
+
+  MediaSource() : url(NULL), uri(NULL), startPos(0), autoPlay(true), enableCache(false),
+                  enableMultiAudioTrack(false), provider(NULL) {
+  }
+};
+
+} // namespace base
+} // namespace media
+} // namespace agora
diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraOptional.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraOptional.h
new file mode 100644
index 000000000..97595be45
--- /dev/null
+++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraOptional.h
@@ -0,0 +1,891 @@
+// Copyright (c) 2019 Agora.io. All rights reserved
+
+// This program is confidential and proprietary to Agora.io.
+// And may not be copied, reproduced, modified, disclosed to others, published
+// or used, in whole or in part, without the express prior written permission
+// of Agora.io.
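The `IMediaPlayerCustomDataProvider` contract in `AgoraMediaPlayerTypes.h` above (`onReadData` filling a caller-supplied buffer, `onSeek` following lseek()-style `whence` semantics, with 65536/AVSEEK_SIZE reporting the size without moving) can be sketched against an in-memory buffer. The minimal base class below is a stand-in for the SDK declaration, and `MemoryDataProvider` is a hypothetical illustration, not SDK code:

```cpp
#include <algorithm>
#include <cstdint>
#include <cstring>
#include <utility>
#include <vector>

namespace sketch {

// Stand-in for the interface declared in AgoraMediaPlayerTypes.h.
class IMediaPlayerCustomDataProvider {
 public:
  virtual int onReadData(unsigned char* buffer, int bufferSize) = 0;
  virtual int64_t onSeek(int64_t offset, int whence) = 0;
  virtual ~IMediaPlayerCustomDataProvider() {}
};

// Hypothetical provider backed by an in-memory byte buffer.
class MemoryDataProvider : public IMediaPlayerCustomDataProvider {
 public:
  explicit MemoryDataProvider(std::vector<unsigned char> data)
      : data_(std::move(data)), pos_(0) {}

  // Fill up to bufferSize bytes; return the number of bytes filled, 0 on failure.
  int onReadData(unsigned char* buffer, int bufferSize) override {
    const int64_t size = static_cast<int64_t>(data_.size());
    if (bufferSize <= 0 || pos_ >= size) return 0;
    const int64_t n = std::min<int64_t>(bufferSize, size - pos_);
    std::memcpy(buffer, data_.data() + pos_, static_cast<size_t>(n));
    pos_ += n;
    return static_cast<int>(n);
  }

  // lseek()-style seek; whence 65536 (AVSEEK_SIZE) only reports the total size.
  int64_t onSeek(int64_t offset, int whence) override {
    const int64_t size = static_cast<int64_t>(data_.size());
    if (whence == 65536) return size;  // AVSEEK_SIZE: report size, do not move.
    int64_t target;
    switch (whence) {
      case 0: target = offset; break;          // SEEK_SET
      case 1: target = pos_ + offset; break;   // SEEK_CUR
      case 2: target = size + offset; break;   // SEEK_END
      default: return -1;
    }
    if (target < 0 || target > size) return -1;
    pos_ = target;
    return pos_;
  }

 private:
  std::vector<unsigned char> data_;
  int64_t pos_;
};

}  // namespace sketch
```

A provider like this would be assigned to `MediaSource::provider`, letting the player pull bytes through these two callbacks instead of a `url`.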
+#pragma once + +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) +#include +#endif +#include + +#ifndef CONSTEXPR +#if __cplusplus >= 201103L || (defined(_MSVC_LANG) && _MSVC_LANG >= 201103L) +#define CONSTEXPR constexpr +#else +#define CONSTEXPR +#endif +#endif // !CONSTEXPR + +#ifndef NOEXCEPT +#if __cplusplus >= 201103L || (defined(_MSVC_LANG) && _MSVC_LANG >= 201103L) +#define NOEXCEPT(Expr) noexcept(Expr) +#else +#define NOEXCEPT(Expr) +#endif +#endif // !NOEXCEPT + +namespace agora { + +// Specification: +// http://en.cppreference.com/w/cpp/utility/optional/in_place_t +struct in_place_t {}; + +// Specification: +// http://en.cppreference.com/w/cpp/utility/optional/nullopt_t +struct nullopt_t { + CONSTEXPR explicit nullopt_t(int) {} +}; + +// Specification: +// http://en.cppreference.com/w/cpp/utility/optional/in_place +/*CONSTEXPR*/ const in_place_t in_place = {}; + +// Specification: +// http://en.cppreference.com/w/cpp/utility/optional/nullopt +/*CONSTEXPR*/ const nullopt_t nullopt(0); + +// Forward declaration, which is refered by following helpers. +template +class Optional; + +namespace internal { + +template +struct OptionalStorageBase { + // Initializing |empty_| here instead of using default member initializing + // to avoid errors in g++ 4.8. + CONSTEXPR OptionalStorageBase() : is_populated_(false), empty_('\0') {} + +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + template + CONSTEXPR explicit OptionalStorageBase(in_place_t, Args&&... args) + : is_populated_(true), value_(std::forward(args)...) {} +#else + CONSTEXPR explicit OptionalStorageBase(in_place_t, const T& _value) + : is_populated_(true), value_(_value) {} +#endif + // When T is not trivially destructible we must call its + // destructor before deallocating its memory. + // Note that this hides the (implicitly declared) move constructor, which + // would be used for constexpr move constructor in OptionalStorage. 
+ // It is needed iff T is trivially move constructible. However, the current + // is_trivially_{copy,move}_constructible implementation requires + // is_trivially_destructible (which looks a bug, cf: + // https://gcc.gnu.org/bugzilla/show_bug.cgi?id=51452 and + // http://cplusplus.github.io/LWG/lwg-active.html#2116), so it is not + // necessary for this case at the moment. Please see also the destructor + // comment in "is_trivially_destructible = true" specialization below. + ~OptionalStorageBase() { + if (is_populated_) + value_.~T(); + } + +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + template + void Init(Args&&... args) { + ::new (&value_) T(std::forward(args)...); + is_populated_ = true; + } +#else + void Init(const T& _value) { + ::new (&value_) T(_value); + is_populated_ = true; + } +#endif + + bool is_populated_; + + union { + // |empty_| exists so that the union will always be initialized, even when + // it doesn't contain a value. Union members must be initialized for the + // constructor to be 'constexpr'. + char empty_; + T value_; + }; +}; + +// Implement conditional constexpr copy and move constructors. These are +// constexpr if is_trivially_{copy,move}_constructible::value is true +// respectively. If each is true, the corresponding constructor is defined as +// "= default;", which generates a constexpr constructor (In this case, +// the condition of constexpr-ness is satisfied because the base class also has +// compiler generated constexpr {copy,move} constructors). Note that +// placement-new is prohibited in constexpr. +template +struct OptionalStorage : OptionalStorageBase { + // This is no trivially {copy,move} constructible case. Other cases are + // defined below as specializations. + + // Accessing the members of template base class requires explicit + // declaration. 
+ using OptionalStorageBase::is_populated_; + using OptionalStorageBase::value_; + using OptionalStorageBase::Init; + + // Inherit constructors (specifically, the in_place constructor). + //using OptionalStorageBase::OptionalStorageBase; + +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + template + CONSTEXPR explicit OptionalStorage(in_place_t in_place, Args&&... args) + : OptionalStorageBase(in_place, std::forward(args)...) {} +#else + CONSTEXPR explicit OptionalStorage(in_place_t in_place, const T& _value) + : OptionalStorageBase(in_place, _value) {} +#endif + + // User defined constructor deletes the default constructor. + // Define it explicitly. + OptionalStorage() {} + + OptionalStorage(const OptionalStorage& other) { + if (other.is_populated_) + Init(other.value_); + } + +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + OptionalStorage(OptionalStorage&& other) NOEXCEPT(std::is_nothrow_move_constructible::value) { + if (other.is_populated_) + Init(std::move(other.value_)); + } +#endif +}; + +// Base class to support conditionally usable copy-/move- constructors +// and assign operators. +template +class OptionalBase { + // This class provides implementation rather than public API, so everything + // should be hidden. Often we use composition, but we cannot in this case + // because of C++ language restriction. + protected: + CONSTEXPR OptionalBase() {} + CONSTEXPR OptionalBase(const OptionalBase& other) : storage_(other.storage_) {} +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + CONSTEXPR OptionalBase(OptionalBase&& other) : storage_(std::move(other.storage_)) {} + + template + CONSTEXPR explicit OptionalBase(in_place_t, Args&&... args) + : storage_(in_place, std::forward(args)...) {} +#else + CONSTEXPR explicit OptionalBase(in_place_t, const T& _value) + : storage_(in_place, _value) {} +#endif + + // Implementation of converting constructors. 
+ template + explicit OptionalBase(const OptionalBase& other) { + if (other.storage_.is_populated_) + storage_.Init(other.storage_.value_); + } + +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + template + explicit OptionalBase(OptionalBase&& other) { + if (other.storage_.is_populated_) + storage_.Init(std::move(other.storage_.value_)); + } +#endif + + ~OptionalBase() {} + + OptionalBase& operator=(const OptionalBase& other) { + CopyAssign(other); + return *this; + } + +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + OptionalBase& operator=(OptionalBase&& other) NOEXCEPT( + std::is_nothrow_move_assignable::value && + std::is_nothrow_move_constructible::value) { + MoveAssign(std::move(other)); + return *this; + } +#endif + + template + void CopyAssign(const OptionalBase& other) { + if (other.storage_.is_populated_) + InitOrAssign(other.storage_.value_); + else + FreeIfNeeded(); + } + +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + template + void MoveAssign(OptionalBase&& other) { + if (other.storage_.is_populated_) + InitOrAssign(std::move(other.storage_.value_)); + else + FreeIfNeeded(); + } +#endif + + template +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + void InitOrAssign(U&& value) { + if (storage_.is_populated_) + storage_.value_ = std::forward(value); + else + storage_.Init(std::forward(value)); + } +#else + void InitOrAssign(const U& value) { + if (storage_.is_populated_) + storage_.value_ = value; + else + storage_.Init(value); + } +#endif + + + void FreeIfNeeded() { + if (!storage_.is_populated_) + return; + storage_.value_.~T(); + storage_.is_populated_ = false; + } + + // For implementing conversion, allow access to other typed OptionalBase + // class. 
+ template + friend class OptionalBase; + + OptionalStorage storage_; +}; + +// The following {Copy,Move}{Constructible,Assignable} structs are helpers to +// implement constructor/assign-operator overloading. Specifically, if T is +// is not movable but copyable, Optional's move constructor should not +// participate in overload resolution. This inheritance trick implements that. +template +struct CopyConstructible {}; + +template <> +struct CopyConstructible { + CONSTEXPR CopyConstructible() {} + CopyConstructible& operator=(const CopyConstructible&) { return *this; } +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + CONSTEXPR CopyConstructible(CopyConstructible&&) {} + CopyConstructible& operator=(CopyConstructible&&) { return *this; } +#endif + private: + CONSTEXPR CopyConstructible(const CopyConstructible&); +}; + +template +struct MoveConstructible {}; + +template <> +struct MoveConstructible { + CONSTEXPR MoveConstructible() {} + CONSTEXPR MoveConstructible(const MoveConstructible&) {} + MoveConstructible& operator=(const MoveConstructible&) { return *this; } +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + MoveConstructible& operator=(MoveConstructible&&) { return *this; } + private: + CONSTEXPR MoveConstructible(MoveConstructible&&); +#endif +}; + +template +struct CopyAssignable {}; + +template <> +struct CopyAssignable { + CONSTEXPR CopyAssignable() {} + CONSTEXPR CopyAssignable(const CopyAssignable&) {} +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + CONSTEXPR CopyAssignable(CopyAssignable&&) {} + CopyAssignable& operator=(CopyAssignable&&) { return *this; } +#endif + private: + CopyAssignable& operator=(const CopyAssignable&); +}; + +template +struct MoveAssignable {}; + +template <> +struct MoveAssignable { + CONSTEXPR MoveAssignable() {} + CONSTEXPR MoveAssignable(const MoveAssignable&) {} + MoveAssignable& operator=(const MoveAssignable&) { return *this; } +#if __cplusplus >= 
201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + CONSTEXPR MoveAssignable(MoveAssignable&&) {} + + private: + MoveAssignable& operator=(MoveAssignable&&); +#endif +}; + +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) +// Helper to conditionally enable converting constructors and assign operators. +template +struct IsConvertibleFromOptional + : std::integral_constant< + bool, + std::is_constructible&>::value || + std::is_constructible&>::value || + std::is_constructible&&>::value || + std::is_constructible&&>::value || + std::is_convertible&, T>::value || + std::is_convertible&, T>::value || + std::is_convertible&&, T>::value || + std::is_convertible&&, T>::value> {}; + +template +struct IsAssignableFromOptional + : std::integral_constant< + bool, + IsConvertibleFromOptional::value || + std::is_assignable&>::value || + std::is_assignable&>::value || + std::is_assignable&&>::value || + std::is_assignable&&>::value> {}; + +// Forward compatibility for C++17. +// Introduce one more deeper nested namespace to avoid leaking using std::swap. +namespace swappable_impl { +using std::swap; + +struct IsSwappableImpl { + // Tests if swap can be called. Check(0) returns true_type iff swap + // is available for T. Otherwise, Check's overload resolution falls back + // to Check(...) declared below thanks to SFINAE, so returns false_type. + template + static auto Check(int) + -> decltype(swap(std::declval(), std::declval()), std::true_type()); + + template + static std::false_type Check(...); +}; +} // namespace swappable_impl +template +struct IsSwappable : decltype(swappable_impl::IsSwappableImpl::Check(0)) {}; +#endif +} // namespace internal + +// On Windows, by default, empty-base class optimization does not work, +// which means even if the base class is empty struct, it still consumes one +// byte for its body. __declspec(empty_bases) enables the optimization. 
+// cf) +// https://blogs.msdn.microsoft.com/vcblog/2016/03/30/optimizing-the-layout-of-empty-base-classes-in-vs2015-update-2-3/ +#if defined(_WIN32) +#define OPTIONAL_DECLSPEC_EMPTY_BASES __declspec(empty_bases) +#else +#define OPTIONAL_DECLSPEC_EMPTY_BASES +#endif + +// Optional is a Chromium version of the C++17 optional class: +// std::optional documentation: +// http://en.cppreference.com/w/cpp/utility/optional +// Chromium documentation: +// https://chromium.googlesource.com/chromium/src/+/master/docs/optional.md +// +// These are the differences between the specification and the implementation: +// - Constructors do not use 'constexpr' as it is a C++14 extension. +// - 'constexpr' might be missing in some places for reasons specified locally. +// - No exceptions are thrown, because they are banned from Chromium. +// Marked noexcept for only move constructor and move assign operators. +// - All the non-members are in the 'base' namespace instead of 'std'. +// +// Note that T cannot have a constructor T(Optional) etc. Optional checks +// T's constructor (specifically via IsConvertibleFromOptional), and in the +// check whether T can be constructible from Optional, which is recursive +// so it does not work. As of Feb 2018, std::optional C++17 implementation in +// both clang and gcc has same limitation. MSVC SFINAE looks to have different +// behavior, but anyway it reports an error, too. +template +class OPTIONAL_DECLSPEC_EMPTY_BASES Optional + : public internal::OptionalBase +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + , public internal::CopyConstructible::value>, + public internal::MoveConstructible::value>, + public internal::CopyAssignable::value && + std::is_copy_assignable::value>, + public internal::MoveAssignable::value && + std::is_move_assignable::value> +#endif +{ + public: +#undef OPTIONAL_DECLSPEC_EMPTY_BASES + + typedef T value_type; + + // Defer default/copy/move constructor implementation to OptionalBase. 
+ CONSTEXPR Optional() {} + CONSTEXPR Optional(const Optional& other) : internal::OptionalBase(other) {} + + CONSTEXPR Optional(nullopt_t) {} // NOLINT(runtime/explicit) + + // Converting copy constructor. "explicit" only if + // std::is_convertible::value is false. It is implemented by + // declaring two almost same constructors, but that condition in enable_if_t + // is different, so that either one is chosen, thanks to SFINAE. + template + Optional(const Optional& other) : internal::OptionalBase(other) {} + +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + // Converting move constructor. Similar to converting copy constructor, + // declaring two (explicit and non-explicit) constructors. + template + Optional(Optional&& other) : internal::OptionalBase(std::move(other)) {} + + template + CONSTEXPR explicit Optional(in_place_t, Args&&... args) + : internal::OptionalBase(in_place, std::forward(args)...) {} + + template + CONSTEXPR explicit Optional(in_place_t, + std::initializer_list il, + Args&&... args) + : internal::OptionalBase(in_place, il, std::forward(args)...) {} +#else + CONSTEXPR explicit Optional(in_place_t, const T& _value) + : internal::OptionalBase(in_place, _value) {} + template + CONSTEXPR explicit Optional(in_place_t, + const U il[], + const T& _value) + : internal::OptionalBase(in_place, il, _value) {} +#endif + + // Forward value constructor. Similar to converting constructors, + // conditionally explicit. +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + template + CONSTEXPR Optional(U&& value) + : internal::OptionalBase(in_place, std::forward(value)) {} +#else + template + CONSTEXPR Optional(const U& value) + : internal::OptionalBase(in_place, value) {} +#endif + + ~Optional() {} + + // Defer copy-/move- assign operator implementation to OptionalBase. 
+ Optional& operator=(const Optional& other) { + if (&other == this) { + return *this; + } + + internal::OptionalBase::operator=(other); + return *this; + } + + Optional& operator=(nullopt_t) { + FreeIfNeeded(); + return *this; + } + + // Perfect-forwarded assignment. + template +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + Optional& operator=(U&& value) { + InitOrAssign(std::forward(value)); + return *this; + } +#else + Optional& operator=(const U& value) { + InitOrAssign(value); + return *this; + } +#endif + + // Copy assign the state of other. + template + Optional& operator=(const Optional& other) { + CopyAssign(other); + return *this; + } + +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + // Move assign the state of other. + template + Optional& operator=(Optional&& other) { + MoveAssign(std::move(other)); + return *this; + } +#endif + + const T* operator->() const { + return &storage_.value_; + } + + T* operator->() { + return &storage_.value_; + } + + const T& operator*() const { + return storage_.value_; + } + + T& operator*() { + return storage_.value_; + } + + +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + CONSTEXPR explicit operator bool() const { return storage_.is_populated_; } +#else + CONSTEXPR operator bool() const { return storage_.is_populated_; } +#endif + + CONSTEXPR bool has_value() const { return storage_.is_populated_; } + +#if 1 + const T& value() const { + return storage_.value_; + } + + template +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + CONSTEXPR T value_or(U&& default_value) const { + // TODO(mlamouri): add the following assert when possible: + // static_assert(std::is_copy_constructible::value, + // "T must be copy constructible"); + static_assert(std::is_convertible::value, + "U must be convertible to T"); + return storage_.is_populated_ + ? 
value() + : static_cast(std::forward(default_value)); + } +#else + CONSTEXPR T value_or(const U& default_value) const { + return storage_.is_populated_ + ? value() + : static_cast(default_value); + } +#endif +#else + const T& value() const & { + return storage_.value_; + } + +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + const T&& value() const && { + return std::move(storage_.value_); + } +#endif + + template +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + CONSTEXPR T value_or(U&& default_value) const & { + // TODO(mlamouri): add the following assert when possible: + // static_assert(std::is_copy_constructible::value, + // "T must be copy constructible"); + static_assert(std::is_convertible::value, + "U must be convertible to T"); + return storage_.is_populated_ + ? value() + : static_cast(std::forward(default_value)); + } +#else + CONSTEXPR T value_or(const U& default_value) const & { + // TODO(mlamouri): add the following assert when possible: + // static_assert(std::is_copy_constructible::value, + // "T must be copy constructible"); + static_assert(std::is_convertible::value, + "U must be convertible to T"); + return storage_.is_populated_ + ? value() + : static_cast(default_value); + } +#endif + + template +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + CONSTEXPR T value_or(U&& default_value) const && { + // TODO(mlamouri): add the following assert when possible: + // static_assert(std::is_move_constructible::value, + // "T must be move constructible"); + static_assert(std::is_convertible::value, + "U must be convertible to T"); + return storage_.is_populated_ + ? 
std::move(value()) + : static_cast(std::forward(default_value)); + } +#endif +#endif // 1 + + void swap(Optional& other) { + if (!storage_.is_populated_ && !other.storage_.is_populated_) + return; + +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + if (storage_.is_populated_ != other.storage_.is_populated_) { + if (storage_.is_populated_) { + other.storage_.Init(std::move(storage_.value_)); + FreeIfNeeded(); + } else { + storage_.Init(std::move(other.storage_.value_)); + other.FreeIfNeeded(); + } + return; + } +#endif + using std::swap; + swap(**this, *other); + } + + void reset() { FreeIfNeeded(); } + +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + template + T& emplace(Args&&... args) { + FreeIfNeeded(); + storage_.Init(std::forward(args)...); + return storage_.value_; + } + + template + T& emplace(std::initializer_list il, Args&&... args) { + FreeIfNeeded(); + storage_.Init(il, std::forward(args)...); + return storage_.value_; + } +#else + T& emplace(const T& _value) { + FreeIfNeeded(); + storage_.Init(_value); + return storage_.value_; + } + template + T& emplace(const U il[], const T& _value) { + FreeIfNeeded(); + storage_.Init(il, _value); + return storage_.value_; + } +#endif + + private: + // Accessing template base class's protected member needs explicit + // declaration to do so. + using internal::OptionalBase::CopyAssign; + using internal::OptionalBase::FreeIfNeeded; + using internal::OptionalBase::InitOrAssign; +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) + using internal::OptionalBase::MoveAssign; +#endif + using internal::OptionalBase::storage_; +}; + +// Here after defines comparation operators. The definition follows +// http://en.cppreference.com/w/cpp/utility/optional/operator_cmp +// while bool() casting is replaced by has_value() to meet the chromium +// style guide. 
+template <class T, class U>
+bool operator==(const Optional<T>& lhs, const Optional<U>& rhs) {
+  if (lhs.has_value() != rhs.has_value())
+    return false;
+  if (!lhs.has_value())
+    return true;
+  return *lhs == *rhs;
+}
+
+template <class T, class U>
+bool operator!=(const Optional<T>& lhs, const Optional<U>& rhs) {
+  if (lhs.has_value() != rhs.has_value())
+    return true;
+  if (!lhs.has_value())
+    return false;
+  return *lhs != *rhs;
+}
+
+template <class T, class U>
+bool operator<(const Optional<T>& lhs, const Optional<U>& rhs) {
+  if (!rhs.has_value())
+    return false;
+  if (!lhs.has_value())
+    return true;
+  return *lhs < *rhs;
+}
+
+template <class T, class U>
+bool operator<=(const Optional<T>& lhs, const Optional<U>& rhs) {
+  if (!lhs.has_value())
+    return true;
+  if (!rhs.has_value())
+    return false;
+  return *lhs <= *rhs;
+}
+
+template <class T, class U>
+bool operator>(const Optional<T>& lhs, const Optional<U>& rhs) {
+  if (!lhs.has_value())
+    return false;
+  if (!rhs.has_value())
+    return true;
+  return *lhs > *rhs;
+}
+
+template <class T, class U>
+bool operator>=(const Optional<T>& lhs, const Optional<U>& rhs) {
+  if (!rhs.has_value())
+    return true;
+  if (!lhs.has_value())
+    return false;
+  return *lhs >= *rhs;
+}
+
+template <class T>
+CONSTEXPR bool operator==(const Optional<T>& opt, nullopt_t) {
+  return !opt;
+}
+
+template <class T>
+CONSTEXPR bool operator==(nullopt_t, const Optional<T>& opt) {
+  return !opt;
+}
+
+template <class T>
+CONSTEXPR bool operator!=(const Optional<T>& opt, nullopt_t) {
+  return opt.has_value();
+}
+
+template <class T>
+CONSTEXPR bool operator!=(nullopt_t, const Optional<T>& opt) {
+  return opt.has_value();
+}
+
+template <class T>
+CONSTEXPR bool operator<(const Optional<T>&, nullopt_t) {
+  return false;
+}
+
+template <class T>
+CONSTEXPR bool operator<(nullopt_t, const Optional<T>& opt) {
+  return opt.has_value();
+}
+
+template <class T>
+CONSTEXPR bool operator<=(const Optional<T>& opt, nullopt_t) {
+  return !opt;
+}
+
+template <class T>
+CONSTEXPR bool operator<=(nullopt_t, const Optional<T>&) {
+  return true;
+}
+
+template <class T>
+CONSTEXPR bool operator>(const Optional<T>& opt, nullopt_t) {
+  return opt.has_value();
+}
+
+template <class T>
+CONSTEXPR bool operator>(nullopt_t, const Optional<T>&) {
+  return false;
+}
+
+template <class T>
+CONSTEXPR bool operator>=(const Optional<T>&, nullopt_t) {
+  return true;
+}
+
+template <class T>
+CONSTEXPR bool operator>=(nullopt_t, const Optional<T>& opt) {
+  return !opt;
+}
+
+template <class T, class U>
+CONSTEXPR bool operator==(const Optional<T>& opt, const U& value) {
+  return opt.has_value() ? *opt == value : false;
+}
+
+template <class T, class U>
+CONSTEXPR bool operator==(const U& value, const Optional<T>& opt) {
+  return opt.has_value() ? value == *opt : false;
+}
+
+template <class T, class U>
+CONSTEXPR bool operator!=(const Optional<T>& opt, const U& value) {
+  return opt.has_value() ? *opt != value : true;
+}
+
+template <class T, class U>
+CONSTEXPR bool operator!=(const U& value, const Optional<T>& opt) {
+  return opt.has_value() ? value != *opt : true;
+}
+
+template <class T, class U>
+CONSTEXPR bool operator<(const Optional<T>& opt, const U& value) {
+  return opt.has_value() ? *opt < value : true;
+}
+
+template <class T, class U>
+CONSTEXPR bool operator<(const U& value, const Optional<T>& opt) {
+  return opt.has_value() ? value < *opt : false;
+}
+
+template <class T, class U>
+CONSTEXPR bool operator<=(const Optional<T>& opt, const U& value) {
+  return opt.has_value() ? *opt <= value : true;
+}
+
+template <class T, class U>
+CONSTEXPR bool operator<=(const U& value, const Optional<T>& opt) {
+  return opt.has_value() ? value <= *opt : false;
+}
+
+template <class T, class U>
+CONSTEXPR bool operator>(const Optional<T>& opt, const U& value) {
+  return opt.has_value() ? *opt > value : false;
+}
+
+template <class T, class U>
+CONSTEXPR bool operator>(const U& value, const Optional<T>& opt) {
+  return opt.has_value() ? value > *opt : true;
+}
+
+template <class T, class U>
+CONSTEXPR bool operator>=(const Optional<T>& opt, const U& value) {
+  return opt.has_value() ? *opt >= value : false;
+}
+
+template <class T, class U>
+CONSTEXPR bool operator>=(const U& value, const Optional<T>& opt) {
+  return opt.has_value() ? value >= *opt : true;
+}
+
+#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800)
+template <class T, class... Args>
+CONSTEXPR Optional<T> make_optional(Args&&... args) {
+  return Optional<T>(in_place, std::forward<Args>(args)...);
+}
+
+template <class T, class U, class... Args>
+CONSTEXPR Optional<T> make_optional(std::initializer_list<U> il,
+                                    Args&&... args) {
+  return Optional<T>(in_place, il, std::forward<Args>(args)...);
+}
+#endif
+
+// Partial specialization for a function template is not allowed. Also, it is
+// not allowed to add overload function to std namespace, while it is allowed
+// to specialize the template in std. Thus, swap() (kind of) overloading is
+// defined in base namespace, instead.
+template <class T>
+void swap(Optional<T>& lhs, Optional<T>& rhs) {
+  lhs.swap(rhs);
+}
+
+} // namespace agora
+
+#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800)
+namespace std {
+template <class T>
+struct hash<agora::Optional<T> > {
+  size_t operator()(const agora::Optional<T>& opt) const {
+    return opt == agora::nullopt ? 0 : std::hash<T>()(*opt);
+  }
+};
+} // namespace std
+#endif
+#undef CONSTEXPR
+#undef NOEXCEPT
diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraRefPtr.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraRefPtr.h
new file mode 100644
index 000000000..97594cb87
--- /dev/null
+++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/AgoraRefPtr.h
@@ -0,0 +1,156 @@
+
+// Copyright (c) 2019 Agora.io. All rights reserved
+
+// This program is confidential and proprietary to Agora.io.
+// And may not be copied, reproduced, modified, disclosed to others, published
+// or used, in whole or in part, without the express prior written permission
+// of Agora.io.
+
+#pragma once
+
+#include <utility>
+#if !(__cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800))
+#include <stddef.h>
+#endif
+#ifndef OPTIONAL_ENUM_CLASS
+#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800)
+#define OPTIONAL_ENUM_CLASS enum class
+#else
+#define OPTIONAL_ENUM_CLASS enum
+#endif
+#endif
+
+namespace agora {
+
+OPTIONAL_ENUM_CLASS RefCountReleaseStatus { kDroppedLastRef, kOtherRefsRemained };
+
+// Interfaces where refcounting is part of the public api should
+// inherit this abstract interface. The implementation of these
+// methods is usually provided by the RefCountedObject template class,
+// applied as a leaf in the inheritance tree.
+class RefCountInterface {
+ public:
+  virtual void AddRef() const = 0;
+  virtual RefCountReleaseStatus Release() const = 0;
+  virtual bool HasOneRef() const = 0;
+
+  // Non-public destructor, because Release() has exclusive responsibility for
+  // destroying the object.
+ protected:
+  virtual ~RefCountInterface() {}
+};
+
+template <class T>
+class agora_refptr {
+ public:
+  agora_refptr() : ptr_(NULL) {}
+
+  agora_refptr(T* p) : ptr_(p) {
+    if (ptr_) ptr_->AddRef();
+  }
+
+  template <typename U>
+  agora_refptr(U* p) : ptr_(p) {
+    if (ptr_) ptr_->AddRef();
+  }
+
+  agora_refptr(const agora_refptr& r) : ptr_(r.get()) {
+    if (ptr_) ptr_->AddRef();
+  }
+
+  template <typename U>
+  agora_refptr(const agora_refptr<U>& r) : ptr_(r.get()) {
+    if (ptr_) ptr_->AddRef();
+  }
+
+#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800)
+  agora_refptr(agora_refptr&& r) : ptr_(r.move()) {}
+
+  template <typename U>
+  agora_refptr(agora_refptr<U>&& r) : ptr_(r.move()) {}
+#endif
+
+  ~agora_refptr() {
+    reset();
+  }
+
+  T* get() const { return ptr_; }
+  operator bool() const { return (ptr_ != NULL); }
+
+  T* operator->() const { return ptr_; }
+  T& operator*() const { return *ptr_; }
+
+  // Returns the (possibly null) raw pointer, and makes the agora_refptr hold a
+  // null pointer, all without touching the reference count of the underlying
+  // pointed-to object. The object is still reference counted, and the caller of
+  // move() is now the proud owner of one reference, so it is responsible for
+  // calling Release() once on the object when no longer using it.
+  T* move() {
+    T* retVal = ptr_;
+    ptr_ = NULL;
+    return retVal;
+  }
+
+  agora_refptr<T>& operator=(T* p) {
+    if (ptr_ == p) return *this;
+
+    if (p) p->AddRef();
+    if (ptr_) ptr_->Release();
+    ptr_ = p;
+    return *this;
+  }
+
+  agora_refptr<T>& operator=(const agora_refptr<T>& r) {
+    return *this = r.get();
+  }
+
+#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800)
+  agora_refptr<T>& operator=(agora_refptr<T>&& r) {
+    agora_refptr<T>(std::move(r)).swap(*this);
+    return *this;
+  }
+
+  template <typename U>
+  agora_refptr<T>& operator=(agora_refptr<U>&& r) {
+    agora_refptr<T>(std::move(r)).swap(*this);
+    return *this;
+  }
+#endif
+
+  // For working with std::find()
+  bool operator==(const agora_refptr<T>& r) const { return ptr_ == r.ptr_; }
+
+  // For working with std::set
+  bool operator<(const agora_refptr<T>& r) const { return ptr_ < r.ptr_; }
+
+  void swap(T** pp) {
+    T* p = ptr_;
+    ptr_ = *pp;
+    *pp = p;
+  }
+
+  void swap(agora_refptr<T>& r) { swap(&r.ptr_); }
+
+  void reset() {
+    if (ptr_) {
+      ptr_->Release();
+      ptr_ = NULL;
+    }
+  }
+
+ protected:
+  T* ptr_;
+};
+
+} // namespace agora
+
+#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800)
+namespace std {
+template <class T>
+struct hash<agora::agora_refptr<T> > {
+  std::size_t operator()(const agora::agora_refptr<T>& k) const {
+    return reinterpret_cast<std::size_t>(k.get());
+  }
+};
+} // namespace std
+#endif
\ No newline at end of file
diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraH265Transcoder.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraH265Transcoder.h
new file mode 100644
index 000000000..b2f5c5b31
--- /dev/null
+++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraH265Transcoder.h
@@ -0,0 +1,178 @@
+//
+// Agora Media SDK
+//
+// Copyright (c) 2022 Agora IO.
All rights reserved.
+//
+
+#pragma once
+
+#include "AgoraBase.h"
+#include "AgoraMediaBase.h"
+
+namespace agora {
+namespace rtc {
+
+/**
+ * The result of invoking the IH265Transcoder interface.
+ */
+enum H265_TRANSCODE_RESULT {
+  /**
+   * -1: Unknown error.
+   */
+  H265_TRANSCODE_RESULT_UNKNOWN = -1,
+  /**
+   * 0: The request succeeded.
+   */
+  H265_TRANSCODE_RESULT_SUCCESS = 0,
+  /**
+   * 1: The request is invalid. Possible reasons include incorrect parameters.
+   */
+  H265_TRANSCODE_RESULT_REQUEST_INVALID = 1,
+  /**
+   * 2: Authentication failed. Check that the token is correct.
+   */
+  H265_TRANSCODE_RESULT_UNAUTHORIZED = 2,
+  /**
+   * 3: The token has expired. Renew the token.
+   */
+  H265_TRANSCODE_RESULT_TOKEN_EXPIRED = 3,
+  /**
+   * 4: No permission to access the interface.
+   */
+  H265_TRANSCODE_RESULT_FORBIDDEN = 4,
+  /**
+   * 5: The requested URL was not found.
+   */
+  H265_TRANSCODE_RESULT_NOT_FOUND = 5,
+  /**
+   * 6: The request encountered a conflict. Try again.
+   */
+  H265_TRANSCODE_RESULT_CONFLICTED = 6,
+  /**
+   * 7: The content type is not supported.
+   */
+  H265_TRANSCODE_RESULT_NOT_SUPPORTED = 7,
+  /**
+   * 8: Requests are too frequent.
+   */
+  H265_TRANSCODE_RESULT_TOO_OFTEN = 8,
+  /**
+   * 9: Internal server error. You can try sending the request again.
+   */
+  H265_TRANSCODE_RESULT_SERVER_INTERNAL_ERROR = 9,
+  /**
+   * 10: The service is unavailable.
+   */
+  H265_TRANSCODE_RESULT_SERVICE_UNAVAILABLE = 10
+};
+
+/**
+ * The IH265TranscoderObserver class.
+ */
+class IH265TranscoderObserver {
+ public:
+  virtual ~IH265TranscoderObserver() {}
+
+  /**
+   * Reports the result of calling the enableTranscode interface. Suggested
+   * handling for each result is listed below.
+   * @param result The result of calling the enableTranscode interface.
+   * - H265_TRANSCODE_RESULT_REQUEST_INVALID: The channel or uid parameter is incorrect; check them for correctness.
+   * - H265_TRANSCODE_RESULT_UNAUTHORIZED: Authentication failed. Check that the token is correct.
+   * - H265_TRANSCODE_RESULT_TOKEN_EXPIRED: The token has expired; generate a new token.
+   * - H265_TRANSCODE_RESULT_FORBIDDEN: Contact Agora staff to add your vid to the whitelist.
+   * - H265_TRANSCODE_RESULT_NOT_FOUND: Indicates that the network may be faulty.
+   * - H265_TRANSCODE_RESULT_TOO_OFTEN: Requests are too frequent; try again later.
+   * - H265_TRANSCODE_RESULT_SERVER_INTERNAL_ERROR: The service has an internal error. The request can be made again.
+   */
+  virtual void onEnableTranscode(H265_TRANSCODE_RESULT result) = 0;
+
+  /**
+   * Reports the result of calling the queryChannel interface. Suggested
+   * handling for each result is listed below.
+   * @param result The result of calling the queryChannel interface.
+   * - H265_TRANSCODE_RESULT_UNAUTHORIZED: Authentication failed. Check that the token is correct.
+   * - H265_TRANSCODE_RESULT_TOKEN_EXPIRED: The token has expired; generate a new token.
+   * - H265_TRANSCODE_RESULT_NOT_FOUND: Indicates that the network may be faulty or the channel parameter may be empty.
+   * - H265_TRANSCODE_RESULT_TOO_OFTEN: Requests are too frequent; try again later.
+   * - H265_TRANSCODE_RESULT_SERVER_INTERNAL_ERROR: The service has an internal error. The request can be made again.
+   *
+   * @param originChannel The original channel ID.
+   * @param transcodeChannel The transcoded channel ID.
+   */
+  virtual void onQueryChannel(H265_TRANSCODE_RESULT result, const char* originChannel, const char* transcodeChannel) = 0;
+
+  /** Reports the result of calling the triggerTranscode interface. Suggested
+   * handling for each result is listed below.
+   * @param result The result of calling the triggerTranscode interface.
+   * - H265_TRANSCODE_RESULT_UNAUTHORIZED: Authentication failed. Check that the token is correct.
+   * - H265_TRANSCODE_RESULT_TOKEN_EXPIRED: The token has expired; generate a new token.
+   * - H265_TRANSCODE_RESULT_NOT_FOUND: Indicates that the network may be faulty or the channel parameter may be empty.
+   * - H265_TRANSCODE_RESULT_CONFLICTED: The trigger transcode request hit a conflict; try again.
+   * - H265_TRANSCODE_RESULT_TOO_OFTEN: Requests are too frequent; try again later.
+   * - H265_TRANSCODE_RESULT_SERVER_INTERNAL_ERROR: The service has an internal error. The request can be made again.
+   * - H265_TRANSCODE_RESULT_SERVICE_UNAVAILABLE: The number of transcoding services may have exceeded the limit.
+   */
+  virtual void onTriggerTranscode(H265_TRANSCODE_RESULT result) = 0;
+};
+
+/**
+ * The IH265Transcoder class.
+ */
+class IH265Transcoder : public RefCountInterface {
+ public:
+  /**
+   * Enable transcoding for a channel.
+   * @param token The token for authentication.
+   * @param channel The unique channel name for the AgoraRTC session in the string format.
+   * @param uid User ID.
+   * @return
+   * - 0: Success.
+   * - <0: Failure.
+   */
+  virtual int enableTranscode(const char* token, const char* channel, uid_t uid) = 0;
+
+  /**
+   * Query the transcoded channel of a channel.
+   * @param token The token for authentication.
+   * @param channel The unique channel name for the AgoraRTC session in the string format.
+   * @param uid User ID.
+   * @return
+   * - 0: Success.
+   * - <0: Failure.
+   */
+  virtual int queryChannel(const char* token, const char* channel, uid_t uid) = 0;
+
+  /**
+   * Trigger channel transcoding.
+   * @param token The token for authentication.
+   * @param channel The unique channel name for the AgoraRTC session in the string format.
+   * @param uid User ID.
+   * @return
+   * - 0: Success.
+   * - <0: Failure.
+   */
+  virtual int triggerTranscode(const char* token, const char* channel, uid_t uid) = 0;
+
+  /**
+   * Register an IH265TranscoderObserver object.
+   * @param observer IH265TranscoderObserver.
+   * @return
+   * - 0: Success.
+   * - <0: Failure.
+ */ + virtual int registerTranscoderObserver(IH265TranscoderObserver *observer) = 0; + /** + * Unregister a IH265TranscoderObserver object. + * @param observer IH265TranscoderObserver. + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int unregisterTranscoderObserver(IH265TranscoderObserver *observer) = 0; + + + protected: + virtual ~IH265Transcoder() {}; + +}; + +} // namespace rtc +} // namespace agora \ No newline at end of file diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraLog.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraLog.h new file mode 100644 index 000000000..2fae3aa13 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraLog.h @@ -0,0 +1,98 @@ +// +// Agora Media SDK +// +// Copyright (c) 2015 Agora IO. All rights reserved. +// +#pragma once + +#include +#include + +#ifndef OPTIONAL_ENUM_CLASS +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) +#define OPTIONAL_ENUM_CLASS enum class +#else +#define OPTIONAL_ENUM_CLASS enum +#endif +#endif + +#ifndef OPTIONAL_LOG_LEVEL_SPECIFIER +#if __cplusplus >= 201103L || (defined(_MSC_VER) && _MSC_VER >= 1800) +#define OPTIONAL_LOG_LEVEL_SPECIFIER LOG_LEVEL:: +#else +#define OPTIONAL_LOG_LEVEL_SPECIFIER +#endif +#endif + +namespace agora { +namespace commons { + +/** + * Supported logging severities of SDK + */ +OPTIONAL_ENUM_CLASS LOG_LEVEL { + LOG_LEVEL_NONE = 0x0000, + LOG_LEVEL_INFO = 0x0001, + LOG_LEVEL_WARN = 0x0002, + LOG_LEVEL_ERROR = 0x0004, + LOG_LEVEL_FATAL = 0x0008, + LOG_LEVEL_API_CALL = 0x0010, +}; + +/* +The SDK uses ILogWriter class Write interface to write logs as application +The application inherits the methods Write() to implentation their own log writ + +Write has default implementation, it writes logs to files. 
+Application can use setLogFile() to change the file location; see the description of setLogFile(). +*/ +class ILogWriter { + public: + /** User-defined log write function + @param level log level + @param message log message content + @param length log message length + @return + - 0: success + - <0: failure + */ + virtual int32_t writeLog(LOG_LEVEL level, const char* message, uint16_t length) = 0; + + virtual ~ILogWriter() {} +}; + +enum LOG_FILTER_TYPE { + LOG_FILTER_OFF = 0, + LOG_FILTER_DEBUG = 0x080f, + LOG_FILTER_INFO = 0x000f, + LOG_FILTER_WARN = 0x000e, + LOG_FILTER_ERROR = 0x000c, + LOG_FILTER_CRITICAL = 0x0008, + LOG_FILTER_MASK = 0x80f, +}; + +const uint32_t MAX_LOG_SIZE = 20 * 1024 * 1024; // 20MB +const uint32_t MIN_LOG_SIZE = 128 * 1024; // 128KB +/** The default log size in KB + */ +const uint32_t DEFAULT_LOG_SIZE_IN_KB = 2048; + +/** Definition of LogConfiguration + */ +struct LogConfig { + /** The log file path; the default is NULL for the default log path + */ + const char* filePath; + /** The log file size in KB; set 2048 KB to use the default log size + */ + uint32_t fileSizeInKB; + /** The log level; set LOG_LEVEL_INFO to use the default log level + */ + LOG_LEVEL level; + + LogConfig() : filePath(NULL), fileSizeInKB(DEFAULT_LOG_SIZE_IN_KB), level(OPTIONAL_LOG_LEVEL_SPECIFIER LOG_LEVEL_INFO) {} +}; +} // namespace commons +} // namespace agora + +#undef OPTIONAL_LOG_LEVEL_SPECIFIER diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaComponentFactory.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaComponentFactory.h new file mode 100644 index 000000000..28de53fbd --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaComponentFactory.h @@ -0,0 +1,41 @@ +// +// Agora SDK +// +// Copyright (c) 2021 Agora.io. All rights reserved.
+// +#pragma once // NOLINT(build/header_guard) + +#include "AgoraBase.h" +#include "AgoraRefPtr.h" + +namespace agora { +namespace rtc { + +class IMediaPlayer; + +class IMediaComponentFactory { +public: + /** Creates a media player. + */ + virtual agora_refptr<IMediaPlayer> createMediaPlayer( + agora::media::base::MEDIA_PLAYER_SOURCE_TYPE type = agora::media::base::MEDIA_PLAYER_SOURCE_DEFAULT) = 0; + +protected: + virtual ~IMediaComponentFactory() {} +}; + +} //namespace rtc +} // namespace agora + +/** \addtogroup createMediaComponentFactory + @{ + */ +/** + * Creates an \ref agora::rtc::IMediaComponentFactory "IMediaComponentFactory" object and returns the pointer. + * + * @return + * - The pointer to \ref agora::rtc::IMediaComponentFactory "IMediaComponentFactory": Success. + * - A null pointer: Failure. + */ +AGORA_API agora::rtc::IMediaComponentFactory* AGORA_CALL createAgoraMediaComponentFactory(); +/** @} */ diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaEngine.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaEngine.h new file mode 100644 index 000000000..e57404a22 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaEngine.h @@ -0,0 +1,270 @@ +// +// Agora Media SDK +// +// Copyright (c) 2015 Agora IO. All rights reserved. +// +#pragma once + +#include "AgoraBase.h" +#include "AgoraMediaBase.h" +#include "AgoraRefPtr.h" + +namespace agora { +namespace media { + +/** Dual-mono music output mode + */ +enum AUDIO_MIXING_DUAL_MONO_MODE { + /* 0: Original mode */ + AUDIO_MIXING_DUAL_MONO_AUTO = 0, + /* 1: Left channel mode */ + AUDIO_MIXING_DUAL_MONO_L = 1, + /* 2: Right channel mode */ + AUDIO_MIXING_DUAL_MONO_R = 2, + /* 3: Mixed channel mode */ + AUDIO_MIXING_DUAL_MONO_MIX = 3 +}; + + +/** + * The IMediaEngine class. + */ +class IMediaEngine { + public: + /** + * Registers an audio frame observer object.
+ * + * @note + * Ensure that you call this method before \ref IRtcEngine::joinChannel "joinChannel". + * + * @param observer A pointer to the audio frame observer object: IAudioFrameObserver; + * passing nullptr unregisters the observer instead. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int registerAudioFrameObserver(IAudioFrameObserver* observer) = 0; + /** + * Registers a video frame observer object. + * + * @note + * - Ensure that you call this method before joining the channel. + * - If you register an observer for raw video data, you cannot register an IVideoEncodedFrameObserver + * object. + * + * @param observer A pointer to the video frame observer: IVideoFrameObserver. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int registerVideoFrameObserver(IVideoFrameObserver* observer) = 0; + /** + * Registers a receiver object for the encoded video image. + * + * @note + * - Ensure that you call this method before joining the channel. + * + * @param observer A pointer to the observer of the encoded video image: \ref IVideoEncodedFrameObserver + * "IVideoEncodedFrameObserver". + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int registerVideoEncodedFrameObserver(IVideoEncodedFrameObserver* observer) = 0; + /** + * Pushes the external audio data to the SDK. + * + * @param frame The audio buffer data. + * @param trackId The audio track ID. + * @return + * - 0: Success. + * - < 0: Failure. + */ + + virtual int pushAudioFrame(IAudioFrameObserverBase::AudioFrame* frame, rtc::track_id_t trackId = 0) = 0; + + /** + * Pulls the remote audio data. + * + * After a successful method call, the app pulls the decoded and mixed audio data for playback. + * + * The difference between this method and the \ref onPlaybackAudioFrame "onPlaybackAudioFrame" is as follows: + * - `onPlaybackAudioFrame`: The SDK sends the audio data to the app once every 10 ms.
Any delay in processing + * the audio frames may result in audio jitter. + * - `pullAudioFrame`: The app pulls the remote audio data. After setting the audio data parameters, the + * SDK adjusts the frame buffer and avoids problems caused by jitter in the external audio playback. + * + * @param frame The pointer to the audio frame: AudioFrame. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int pullAudioFrame(IAudioFrameObserverBase::AudioFrame* frame) = 0; + + /** + * Sets the external video source. + * + * Once the external video source is enabled, the SDK prepares to accept the external video frame. + * + * @param enabled Determines whether to enable the external video source. + * - true: Enable the external video source. Once set, the SDK creates the external source and accepts + * video data from `pushVideoFrame` or `pushEncodedVideoImage`. + * - false: Disable the external video source. + * @param useTexture Determines whether to use textured video data. + * - true: Use texture, which is not supported now. + * - false: Do not use texture. + * @param sourceType Determines the type of external video source frame. + * - ENCODED_VIDEO_FRAME: The external video source is encoded. + * - VIDEO_FRAME: The external video source is not encoded. + * @param encodedVideoOption Video encoded track option, which is only used for ENCODED_VIDEO_FRAME. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setExternalVideoSource( + bool enabled, bool useTexture, EXTERNAL_VIDEO_SOURCE_TYPE sourceType = VIDEO_FRAME, + rtc::SenderOptions encodedVideoOption = rtc::SenderOptions()) = 0; + + /** + * Sets the external audio source. + * + * @note + * Ensure that you call this method before joining the channel. + * + * @deprecated This method is deprecated. Use createCustomAudioTrack(rtc::AUDIO_TRACK_TYPE trackType, const rtc::AudioTrackConfig& config) instead.
+ * + * @param enabled Determines whether to enable the external audio source: + * - true: Enable the external audio source. + * - false: (default) Disable the external audio source. + * @param sampleRate The sample rate (Hz) of the external audio source, which can be set as + * 8000, 16000, 32000, 44100, or 48000. + * @param channels The number of channels of the external audio source, which can be set as 1 or 2: + * - 1: Mono. + * - 2: Stereo. + * @param localPlayback Enables/disables the local playback of the external audio track: + * - true: Enable local playback + * - false: (Default) Do not enable local playback + * @param publish Determines whether to publish the external audio track: + * - true: (Default) Publish the external audio track. + * - false: Don't publish the external audio track. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setExternalAudioSource(bool enabled, int sampleRate, int channels, bool localPlayback = false, bool publish = true) __deprecated = 0; + + /** + * Creates a custom audio track and gets the audio track ID. + * + * @note Ensure that you call this method before calling `joinChannel`. + * + * @param trackType The type of custom audio track + * See AUDIO_TRACK_TYPE. + * + * @param config The config of custom audio track + * See AudioTrackConfig. + * + * @return + * - If the call is successful, the SDK returns the audio track ID. + * - If the call fails, the SDK returns 0xffffffff. + */ + virtual rtc::track_id_t createCustomAudioTrack(rtc::AUDIO_TRACK_TYPE trackType, const rtc::AudioTrackConfig& config) = 0; + + /** + * Destroys a custom audio track by track ID. + * + * @param trackId The custom audio track id. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int destroyCustomAudioTrack(rtc::track_id_t trackId) = 0; + + /** + * Sets the external audio sink. + * + * This method applies to scenarios where you want to use external audio + * data for playback.
After calling the \ref IRtcEngine::initialize "initialize" + * method and passing `false` to the `enableAudioDevice` member of the RtcEngineContext struct, you can call + * the \ref agora::media::IMediaEngine::pullAudioFrame "pullAudioFrame" method to pull the remote audio data, process + * it, and play it with the audio effects that you want. + * + * @note + * Once you call the \ref IRtcEngine::initialize "initialize" method and pass `false` to the `enableAudioDevice` + * member of the RtcEngineContext struct, the app will not retrieve any audio data from the + * \ref agora::media::IAudioFrameObserver::onPlaybackAudioFrame "onPlaybackAudioFrame" callback. + * + * @param enabled Sets whether or not to enable the external audio sink: + * - true: Enables the external audio sink. + * - false: Disables the external audio sink. + * @param sampleRate Sets the sample rate (Hz) of the external audio sink, which can be set as 16000, 32000, 44100 or 48000. + * @param channels Sets the number of audio channels of the external + * audio sink: + * - 1: Mono. + * - 2: Stereo. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setExternalAudioSink(bool enabled, int sampleRate, int channels) = 0; + + /** + * Enables or disables local playback of an external audio track. + * + * @note + * Ensure that you call this method before joining the channel. + * + * @param trackId The custom audio track id. + * @param enabled Enables/disables the local playback of the external audio track: + * - true: Enable local playback + * - false: Do not enable local playback + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableCustomAudioLocalPlayback(rtc::track_id_t trackId, bool enabled) = 0; + + /** + * Pushes the external video frame to the SDK. + * + * @param frame The external video frame: ExternalVideoFrame. + * @param videoTrackId The id of the video track. + * @return + * - 0: Success. + * - < 0: Failure.
+ */ + virtual int pushVideoFrame(base::ExternalVideoFrame* frame, unsigned int videoTrackId = 0) = 0; + /** + * Pushes the encoded video image to the SDK. + * @param imageBuffer A pointer to the video image. + * @param length The data length. + * @param videoEncodedFrameInfo The reference to the information of the encoded video frame: + * \ref agora::rtc::EncodedVideoFrameInfo "EncodedVideoFrameInfo". + * @param videoTrackId The id of the video track. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int pushEncodedVideoImage(const unsigned char* imageBuffer, size_t length, + const agora::rtc::EncodedVideoFrameInfo& videoEncodedFrameInfo, + unsigned int videoTrackId = 0) = 0; + /** + * @hide For internal usage only + */ + virtual int addVideoFrameRenderer(IVideoFrameObserver *renderer) = 0; + + /** + * @hide For internal usage only + */ + virtual int removeVideoFrameRenderer(IVideoFrameObserver *renderer) = 0; + + virtual void release() = 0; + + protected: + virtual ~IMediaEngine() {} +}; + +} // namespace media + +} // namespace agora diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaPlayer.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaPlayer.h new file mode 100644 index 000000000..bd3c7597c --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaPlayer.h @@ -0,0 +1,644 @@ +// +// Agora SDK +// +// Copyright (c) 2020 Agora.io. All rights reserved. +// +#pragma once // NOLINT(build/header_guard) + +#include "AgoraBase.h" +#include "AgoraMediaBase.h" +#include "AgoraMediaPlayerTypes.h" +#include "AgoraRefPtr.h" + +namespace agora { +namespace base { +class IAgoraService; +} +namespace rtc { + +class ILocalAudioTrack; +class ILocalVideoTrack; +class IMediaPlayerSourceObserver; +class IMediaPlayerCustomDataProvider; + +/** + * The IMediaPlayer class provides access to a media player entity. If you want to play out
If yout want to playout + * multiple media sources simultaneously, create multiple media player source objects. + */ +class IMediaPlayer : public RefCountInterface { +protected: + virtual ~IMediaPlayer() {} + +public: + virtual int initialize(base::IAgoraService* agora_service) = 0; + + /** + * Get unique media player id of the media player entity. + * @return + * - >= 0: The source id of this media player entity. + * - < 0: Failure. + */ + virtual int getMediaPlayerId() const = 0; + + /** + * Opens a media file with a specified URL. + * @param url The URL of the media file that you want to play. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int open(const char* url, int64_t startPos) = 0; + + /** + * @deprecated + * @brief Open media file or stream with custom soucrce. + * @param startPos Set the starting position for playback, in seconds + * @param observer dataProvider object + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int openWithCustomSource(int64_t startPos, media::base::IMediaPlayerCustomDataProvider* provider) __deprecated = 0; + + /** + * @brief Open a media file with a media file source. + * @param source Media file source that you want to play, see `MediaSource` + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int openWithMediaSource(const media::base::MediaSource &source) = 0; + + /** + * Plays the media file. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int play() = 0; + + /** + * Pauses playing the media file. + */ + virtual int pause() = 0; + + /** + * Stops playing the current media file. + */ + virtual int stop() = 0; + + /** + * Resumes playing the media file. + */ + virtual int resume() = 0; + + /** + * Sets the current playback position of the media file. + * @param newPos The new playback position (ms). + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int seek(int64_t newPos) = 0; + + /** Sets the pitch of the current media file. 
+ * @param pitch Sets the pitch of the local music file by chromatic scale. The default value is 0, + * which means keeping the original pitch. The value ranges from -12 to 12, and the pitch value between + * consecutive values is a chromatic value. The greater the absolute value of this parameter, the + * higher or lower the pitch of the local music file. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setAudioPitch(int pitch) = 0; + + /** + * Gets the duration of the media file. + * @param duration A reference to the duration of the media file. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getDuration(int64_t& duration) = 0; + + /** + * Gets the current playback position of the media file. + * @param pos A reference to the current playback position (ms). + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getPlayPosition(int64_t& pos) = 0; + + virtual int getStreamCount(int64_t& count) = 0; + + virtual int getStreamInfo(int64_t index, media::base::PlayerStreamInfo* info) = 0; + + /** + * Sets whether to loop the media file for playback. + * @param loopCount the number of times looping the media file. + * - 0: Play the media file once. + * - 1: Play the media file twice. + * - -1: Play the media file in a loop indefinitely, until stop() is called. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setLoopCount(int loopCount) = 0; + + /** + * Changes the playback speed. + * @param speed The playback speed, in the range [50, 400]. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setPlaybackSpeed(int speed) = 0; + + /** + * Selects a playback audio track of the media file. + * @param index The index of the audio track in the media file. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int selectAudioTrack(int index) = 0; + + /** + * Selects multiple audio tracks of the media file for playback or publishing to a channel.
+ * @param playoutTrackIndex The index of the audio track in the media file for local playback. + * @param publishTrackIndex The index of the audio track in the media file published to the remote. + * + * @note + * You can obtain the streamIndex of the audio track by calling getStreamInfo. + * If you want to use selectMultiAudioTrack, you need to open the media file with openWithMediaSource and set enableMultiAudioTrack to true. + * + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + * - -2: Invalid argument. Argument must be greater than or equal to zero. + * - -8: Invalid state. You must open the media file with openWithMediaSource and set enableMultiAudioTrack to true. + */ + virtual int selectMultiAudioTrack(int playoutTrackIndex, int publishTrackIndex) = 0; + + /** + * Changes a player option before playing a file. + * @param key The key of the option param. + * @param value The value of the option param. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setPlayerOption(const char* key, int value) = 0; + + /** + * Changes a player option before playing a file. + * @param key The key of the option param. + * @param value The value of the option param. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setPlayerOption(const char* key, const char* value) = 0; + /** + * Takes a screenshot while playing video. + * @param filename The filename of the screenshot file. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int takeScreenshot(const char* filename) = 0; + + /** + * Selects internal subtitles in the video. + * @param index The index of the internal subtitles. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int selectInternalSubtitle(int index) = 0; + + /** + * Sets an external subtitle for the video. + * @param url The URL of the subtitle file that you want to load. + * @return + * - 0: Success. + * - < 0: Failure.
+ */ + virtual int setExternalSubtitle(const char* url) = 0; + + virtual media::base::MEDIA_PLAYER_STATE getState() = 0; + + /** + * @brief Turns mute on or off. + * + * @param muted Whether to mute + * @return int Less than 0 indicates an error; the value corresponds to one of MEDIA_PLAYER_REASON + */ + virtual int mute(bool muted) = 0; + + /** + * @brief Gets the mute state. + * + * @param[out] muted Whether mute is on + * @return int Less than 0 indicates an error; the value corresponds to one of MEDIA_PLAYER_REASON + */ + virtual int getMute(bool& muted) = 0; + + /** + * @brief Adjusts the playback volume. + * + * @param volume The volume value to be adjusted + * The volume can be adjusted from 0 to 400: + * 0: mute; + * 100: original volume; + * 400: Up to 4 times the original volume (with built-in overflow protection). + * @return int Less than 0 indicates an error; the value corresponds to one of MEDIA_PLAYER_REASON + */ + virtual int adjustPlayoutVolume(int volume) = 0; + + /** + * @brief Gets the current playback volume. + * + * @param[out] volume The current playback volume + * @return int Less than 0 indicates an error; the value corresponds to one of MEDIA_PLAYER_REASON + */ + virtual int getPlayoutVolume(int& volume) = 0; + + /** + * @brief Adjusts the publish signal volume. + * + * @return int Less than 0 indicates an error; the value corresponds to one of MEDIA_PLAYER_REASON + */ + virtual int adjustPublishSignalVolume(int volume) = 0; + + /** + * @brief Gets the publish signal volume. + * + * @return int Less than 0 indicates an error; the value corresponds to one of MEDIA_PLAYER_REASON + */ + virtual int getPublishSignalVolume(int& volume) = 0; + + /** + * @brief Sets the video rendering view. + * + * @param view The view object; on the Windows platform it is an HWND + * @return int Less than 0 indicates an error; the value corresponds to one of MEDIA_PLAYER_REASON + */ + virtual int setView(media::base::view_t view) = 0; + + /** + * @brief Sets the video display mode. + * + * @param renderMode Video display mode + * @return int Less than 0 indicates an error; the
value corresponds to one of MEDIA_PLAYER_REASON + */ + virtual int setRenderMode(media::base::RENDER_MODE_TYPE renderMode) = 0; + + /** + * Registers a media player source observer. + * + * Once the media player source observer is registered, you can use the observer to monitor the state change of the media player. + * @param observer The pointer to the IMediaPlayerSourceObserver object. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int registerPlayerSourceObserver(IMediaPlayerSourceObserver* observer) = 0; + + /** + * Releases the media player source observer. + * @param observer The pointer to the IMediaPlayerSourceObserver object. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int unregisterPlayerSourceObserver(IMediaPlayerSourceObserver* observer) = 0; + + /** + * Registers the audio frame observer. + * + * @param observer The pointer to the IAudioPcmFrameSink object. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int registerAudioFrameObserver(media::IAudioPcmFrameSink* observer) = 0; + + /** + * Registers an audio observer. + * + * @param observer The audio observer, reporting the reception of each audio + * frame. See + * \ref media::IAudioPcmFrameSink "IAudioPcmFrameSink" for + * details. + * @param mode Use mode of the audio frame. See #RAW_AUDIO_FRAME_OP_MODE_TYPE. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int registerAudioFrameObserver(media::IAudioPcmFrameSink* observer, + RAW_AUDIO_FRAME_OP_MODE_TYPE mode) = 0; + + /** + * Releases the audio frame observer. + * @param observer The pointer to the IAudioPcmFrameSink object. + * @return + * - 0: Success. + * - < 0: Failure.
+ */ + virtual int unregisterAudioFrameObserver(media::IAudioPcmFrameSink* observer) = 0; + + /** + * @brief Registers the player video observer. + * + * @param observer observer object + * @return int Less than 0 indicates an error; the value corresponds to one of MEDIA_PLAYER_REASON + */ + virtual int registerVideoFrameObserver(media::base::IVideoFrameObserver* observer) = 0; + + /** + * @brief Unregisters the player video observer. + * + * @param observer observer object + * @return int Less than 0 indicates an error; the value corresponds to one of MEDIA_PLAYER_REASON + */ + virtual int unregisterVideoFrameObserver(agora::media::base::IVideoFrameObserver* observer) = 0; + + /** + * Registers the audio frame spectrum observer. + * + * @param observer The pointer to the {@link media::base::IAudioSpectrumObserver IAudioSpectrumObserver} object. + * @param intervalInMS Sets the time interval (ms) between two consecutive audio spectrum callbacks. + * The default value is 100. This param should be larger than 10. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int registerMediaPlayerAudioSpectrumObserver(media::IAudioSpectrumObserver* observer, int intervalInMS) = 0; + + /** + * Releases the audio frame spectrum observer. + * @param observer The pointer to the {@link media::base::IAudioSpectrumObserver IAudioSpectrumObserver} object. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int unregisterMediaPlayerAudioSpectrumObserver(media::IAudioSpectrumObserver* observer) = 0; + + /** + * @brief Sets the dual-mono output mode of the music file. + * + * @param mode dual mono mode. See #agora::media::base::AUDIO_DUAL_MONO_MODE + */ + virtual int setAudioDualMonoMode(agora::media::base::AUDIO_DUAL_MONO_MODE mode) = 0; + + /** + * Gets the SDK version and build number of the player SDK. + * @return String of the SDK version. + * + * @deprecated This method is deprecated. + */ + virtual const char* getPlayerSdkVersion() = 0; + + /** + * Get the current play src.
+ * @return + * - The current play src of raw bytes. + */ + virtual const char* getPlaySrc() = 0; + + + /** + * Open the Agora CDN media source. + * @param src The src of the media file that you want to play. + * @param startPos The playback position (ms). + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int openWithAgoraCDNSrc(const char* src, int64_t startPos) = 0; + + /** + * Gets the number of Agora CDN lines. + * @return + * - > 0: The number of CDN lines. + * - <= 0: Failure. + */ + virtual int getAgoraCDNLineCount() = 0; + + /** + * Switches Agora CDN lines. + * @param index Specific CDN line index. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int switchAgoraCDNLineByIndex(int index) = 0; + + /** + * Gets the line of the current CDN. + * @return + * - >= 0: Specific line. + * - < 0: Failure. + */ + virtual int getCurrentAgoraCDNIndex() = 0; + + /** + * Enables automatic CDN line switching. + * @param enable Whether to enable automatic switching. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableAutoSwitchAgoraCDN(bool enable) = 0; + + /** + * Updates the CDN source token and timestamp. + * @param token The token. + * @param ts The timestamp. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int renewAgoraCDNSrcToken(const char* token, int64_t ts) = 0; + + /** + * Switches the CDN source for media opened through the "openWithAgoraCDNSrc" API. + * @param src Specific src. + * @param syncPts Live streaming must be set to false. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int switchAgoraCDNSrc(const char* src, bool syncPts = false) = 0; + + /** + * Switches the media source for media opened through the "open" API. + * @param src Specific src. + * @param syncPts Live streaming must be set to false. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int switchSrc(const char* src, bool syncPts = true) = 0; + + /** + * Preloads a media source. + * @param src Specific src.
+ * @param startPos The starting position (ms) for playback. Default value is 0. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int preloadSrc(const char* src, int64_t startPos) = 0; + + /** + * Plays a preloaded media source. + * @param src Specific src. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int playPreloadedSrc(const char* src) = 0; + + /** + * Unloads a preloaded media source. + * @param src Specific src. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int unloadSrc(const char* src) = 0; + + /** + * Set spatial audio params for the music file. It can be called after the media player + * was created. + * + * @param params See #agora::SpatialAudioParams. If it is not set, + * spatial audio is disabled; otherwise, it is enabled. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setSpatialAudioParams(const SpatialAudioParams& params) = 0; + + /** + * Set sound position params for the music file. It can be called after the media player + * was created. + * + * @param pan The sound position of the music file. The value ranges from -1.0 to 1.0: + * - 0.0: the music sound comes from the front. + * - -1.0: the music sound comes from the left. + * - 1.0: the music sound comes from the right. + * @param gain Gain of the music. The value ranges from 0.0 to 100.0. The default value is 100.0 (the original gain of the music). The smaller the value, the less the gain. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setSoundPositionParams(float pan, float gain) = 0; + +}; + +/** + * This class is used to set and manage the player cache, implemented in the + * form of a singleton, independent of the player. + */ +class IMediaPlayerCacheManager { +public: + /** + * Delete the longest used cache file in order to release some of the cache file disk usage. + * (usually used when the cache quota notification is received) + * + * @return + * - 0: Success. + * - < 0: Failure.
+ */ + virtual int removeAllCaches() = 0; + /** + * Remove the latest media resource cache file. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int removeOldCache() = 0; + /** + * Remove the cache file by uri, set by MediaSource. + * @param uri The URI, identifying the uniqueness of the property; set from `MediaSource` + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int removeCacheByUri(const char *uri) = 0; + /** + * Set the cache file path that files will be saved to. + * @param path file path. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setCacheDir(const char *path) = 0; + /** + * Set the maximum number of cached files. + * @param count maximum number of cached files. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setMaxCacheFileCount(int count) = 0; + /** + * Set the maximum size of cache file disk usage. + * @param cacheSize total size of the largest cache file. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setMaxCacheFileSize(int64_t cacheSize) = 0; + /** + * Whether to automatically delete old cache files when the cache file usage reaches the limit. + * @param enable enable the player to automatically clear the cache. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableAutoRemoveCache(bool enable) = 0; + /** + * Get the cache directory. + * @param path The buffer that receives the copied cache path. + * @param length the length to be copied. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getCacheDir(char* path, int length) = 0; + /** + * Get the maximum number of cached files. + * @return + * > 0: file count. + * - < 0: Failure. + */ + virtual int getMaxCacheFileCount() = 0; + /** + * Get the total size of the largest cache file + * @return + * > 0: file size. + * - < 0: Failure. + */ + virtual int64_t getMaxCacheFileSize() = 0; + /** + * Get the number of all cache files. + * @return + * > 0: file count.
+ * - < 0: Failure. + */ + virtual int getCacheFileCount() = 0; + + virtual ~IMediaPlayerCacheManager(){}; +}; + +} //namespace rtc +} // namespace agora + +AGORA_API agora::rtc::IMediaPlayerCacheManager* AGORA_CALL getMediaPlayerCacheManager(); diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaPlayerSource.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaPlayerSource.h new file mode 100644 index 000000000..00be02233 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaPlayerSource.h @@ -0,0 +1,504 @@ +// +// Agora SDK +// +// Copyright (c) 2018 Agora.io. All rights reserved. +// +#pragma once // NOLINT(build/header_guard) + +#include "AgoraBase.h" +#include "AgoraMediaBase.h" +#include "AgoraMediaPlayerTypes.h" +#include "AgoraRefPtr.h" + +namespace agora { +namespace rtc { + +class IMediaPlayerSourceObserver; + +/** + * The IMediaPlayerSource class provides access to a media player source. To playout multiple media sources simultaneously, + * create multiple media player source objects. + */ +class IMediaPlayerSource : public RefCountInterface { +protected: + virtual ~IMediaPlayerSource() {} + +public: + + /** + * Gets the unique source ID of the media player source. + * @return + * - >=0: The source ID of this media player source. + * - < 0: Failure. + */ + virtual int getSourceId() const = 0; + + /** + * Opens a media file with a specified URL. + * @param url The path of the media file. Both the local path and online path are supported. + * @param startPos The starting position (ms) for playback. Default value is 0. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int open(const char* url, int64_t startPos) = 0; + + /** + * @deprecated + * @brief Opens a media file or stream with a custom source.
+ * @param startPos Set the starting position for playback, in seconds + * @param provider The custom data provider object + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int openWithCustomSource(int64_t startPos, media::base::IMediaPlayerCustomDataProvider* provider) __deprecated = 0; + + /** + * Opens a media file with a media file source. + * @param source Media file source that you want to play, see `MediaSource` + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int openWithMediaSource(const media::base::MediaSource &source) = 0; + + /** + * Plays the media file. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int play() = 0; + + /** + * Pauses the playback. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int pause() = 0; + + /** + * Stops the playback. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int stop() = 0; + + /** + * Resumes the playback. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int resume() = 0; + + /** + * Sets the playback position of the media file. + * @param newPos The new playback position (ms). + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int seek(int64_t newPos) = 0; + + /** + * Gets the duration of the media file. + * @param [out] duration A reference to the duration of the media file. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getDuration(int64_t& duration) = 0; + + /** + * Gets the current playback position of the media file. + * @param [out] pos A reference to the current playback position (ms). + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual int getPlayPosition(int64_t& pos) = 0; + + /** + * Gets the number of the media streams in the media source. + * @param [out] count The number of the media streams in the media source. + * @return + * - 0: Success. + * - < 0: Failure.
See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual int getStreamCount(int64_t& count) = 0; + + /** + * Gets the detailed information of a media stream. + * @param index The index of the media stream. + * @param [out] info The detailed information of the media stream. See \ref media::base::PlayerStreamInfo "PlayerStreamInfo" for details. + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual int getStreamInfo(int64_t index, media::base::PlayerStreamInfo* info) = 0; + + /** + * Sets whether to loop the media file for playback. + * @param loopCount The number of times of looping the media file. + * - 0: Play the media file once. + * - 1: Play the media file twice. + * - -1: Play the media file in a loop indefinitely, until {@link stop} is called. + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual int setLoopCount(int64_t loopCount) = 0; + + /** + * Changes the playback speed. + * @param speed The playback speed, in the range [50, 400]. + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual int setPlaybackSpeed(int speed) = 0; + + /** + * Selects an audio track of the media file for playback. + * @param index The index of the audio track in the media file. + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual int selectAudioTrack(int64_t index) = 0; + + /** + * Selects multiple audio tracks of the media file for playback or publishing to the channel. + * @param playoutTrackIndex The index of the audio track in the media file for local playback. + * @param publishTrackIndex The index of the audio track in the media file published to the remote. + * + * @note + * You can obtain the streamIndex of the audio track by calling getStreamInfo.
+ * If you want to use selectMultiAudioTrack, you need to open the media file with openWithMediaSource and set enableMultiAudioTrack to true. + * + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + * - -2: Invalid argument. Argument must be greater than or equal to zero. + * - -8: Invalid state. You must open the media file with openWithMediaSource and set enableMultiAudioTrack to true. + */ + virtual int selectMultiAudioTrack(int playoutTrackIndex, int publishTrackIndex) = 0; + + /** + * Changes the player option before playing a file. + * @param key The key of the option parameter. + * @param value The value of the option parameter. + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual int setPlayerOption(const char* key, int64_t value) = 0; + + /** + * Changes the player option before playing a file. + * @param key The key of the option parameter. + * @param value The value of the option parameter. + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual int setPlayerOption(const char* key, const char* value) = 0; + + /** + * Takes a screenshot when playing a video file. + * @param filename The filename of the screenshot file. + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual int takeScreenshot(const char* filename) = 0; + + /** + * Selects internal subtitles for a video file. + * @param index The index of the internal subtitles. + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual int selectInternalSubtitle(int64_t index) = 0; + + /** + * Sets an external subtitle file for a video file. + * @param url The URL of the subtitle file. + * @return + * - 0: Success. + * - < 0: Failure.
See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual int setExternalSubtitle(const char* url) = 0; + + /** + * Gets the playback state. + * @return The current playback state. See {@link media::base::MEDIA_PLAYER_STATE MEDIA_PLAYER_STATE} for details. + */ + virtual media::base::MEDIA_PLAYER_STATE getState() = 0; + + /** + * Registers a media player source observer. + * + * Once the media player source observer is registered, you can use the observer to monitor the state change of the media player. + * @param observer The pointer to the IMediaPlayerSourceObserver object. + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual int registerPlayerSourceObserver(IMediaPlayerSourceObserver* observer) = 0; + + /** + * Releases the media player source observer. + * @param observer The pointer to the IMediaPlayerSourceObserver object. + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual int unregisterPlayerSourceObserver(IMediaPlayerSourceObserver* observer) = 0; + + /** + * Registers the audio frame observer. + * + * @param observer The pointer to the {@link media::IAudioPcmFrameSink observer} object. + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual int registerAudioFrameObserver(media::IAudioPcmFrameSink* observer) = 0; + + /** + * Releases the audio frame observer. + * @param observer The pointer to the {@link media::IAudioPcmFrameSink observer} object. + * @return + * - 0: Success. + * - < 0: Failure. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual int unregisterAudioFrameObserver(media::IAudioPcmFrameSink* observer) = 0; + + /** + * Open the Agora CDN media source. + * @param src The src of the media file that you want to play. + * @param startPos The playback position (ms). 
+ * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int openWithAgoraCDNSrc(const char* src, int64_t startPos) = 0; + + /** + * Gets the number of Agora CDN lines. + * @return + * - > 0: The number of CDN lines. + * - <= 0: Failure. + */ + virtual int getAgoraCDNLineCount() = 0; + + + /** + * Switch Agora CDN lines. + * @param index Specific CDN line index. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int switchAgoraCDNLineByIndex(int index) = 0; + + /** + * Gets the line of the current CDN. + * @return + * - >= 0: Specific line. + * - < 0: Failure. + */ + virtual int getCurrentAgoraCDNIndex() = 0; + + /** + * Enable automatic CDN line switching. + * @param enable Whether to enable automatic switching. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableAutoSwitchAgoraCDN(bool enable) = 0; + + /** + * Update the CDN source token and timestamp. + * @param token The new token. + * @param ts The updated timestamp. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int renewAgoraCDNSrcToken(const char* token, int64_t ts) = 0; + + /** + * Switch the CDN source when opening media through the "openWithAgoraCDNSrc" API. + * @param src Specific src. + * @param syncPts Whether to synchronize the playback position before and after the switch. Must be set to false for live streaming. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int switchAgoraCDNSrc(const char* src, bool syncPts = false) = 0; + + /** + * Switch the media source when opening media through the "open" API. + * @param src Specific src. + * @param syncPts Whether to synchronize the playback position before and after the switch. Must be set to false for live streaming. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int switchSrc(const char* src, bool syncPts) = 0; + + /** + * Preload a media source. + * @param src Specific src. + * @param startPos The starting position (ms) for playback. Default value is 0. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int preloadSrc(const char* src, int64_t startPos) = 0; + + /** + * Unload a preloaded media source. + * @param src Specific src. + * @return + * - 0: Success.
+ * - < 0: Failure. + */ + virtual int unloadSrc(const char* src) = 0; + + /** + * Play a preloaded media source. + * @param src Specific src. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int playPreloadedSrc(const char* src) = 0; + +}; + +/** + * This class reports runtime events to the application. + */ +class IMediaPlayerSourceObserver { + public: + virtual ~IMediaPlayerSourceObserver() {} + + /** + * @brief Reports the playback state change. + * + * When the state of the playback changes, the SDK triggers this callback to report the new playback state and the reason or error for the change. + * @param state The new playback state after change. See {@link media::base::MEDIA_PLAYER_STATE MEDIA_PLAYER_STATE}. + * @param reason The player's error code. See {@link media::base::MEDIA_PLAYER_REASON MEDIA_PLAYER_REASON}. + */ + virtual void onPlayerSourceStateChanged(media::base::MEDIA_PLAYER_STATE state, + media::base::MEDIA_PLAYER_REASON reason) = 0; + + /** + * @brief Reports current playback progress. + * + * The callback occurs once every second during the playback and reports the current playback progress. + * @param positionMs Current playback progress (millisecond). + * @param timestampMs Current NTP (Network Time Protocol) time (millisecond). + */ + virtual void onPositionChanged(int64_t positionMs, int64_t timestampMs) = 0; + + /** + * @brief Reports the playback event. + * + * - After calling the `seek` method, the SDK triggers the callback to report the results of the seek operation. + * - After calling the `selectAudioTrack` method, the SDK triggers the callback to report that the audio track changes. + * + * @param eventCode The playback event. See {@link media::base::MEDIA_PLAYER_EVENT MEDIA_PLAYER_EVENT}. + * @param elapsedTime The playback elapsed time. + * @param message The playback message.
+ */ + virtual void onPlayerEvent(media::base::MEDIA_PLAYER_EVENT eventCode, int64_t elapsedTime, const char* message) = 0; + + /** + * @brief Occurs when the metadata is received. + * + * The callback occurs when the player receives the media metadata and reports the detailed information of the media metadata. + * @param data The detailed data of the media metadata. + * @param length The data length (bytes). + */ + virtual void onMetaData(const void* data, int length) = 0; + + + /** + * @brief Triggered when the play buffer is updated, once every second. + * + * @param playCachedBuffer The cached buffer during playing, in milliseconds. + */ + virtual void onPlayBufferUpdated(int64_t playCachedBuffer) = 0; + + + /** + * @brief Triggered when the player preloads a media source. + * + * @param src The src of the preloaded media source. + * @param event The preload event. See {@link media::base::PLAYER_PRELOAD_EVENT PLAYER_PRELOAD_EVENT}. + */ + virtual void onPreloadEvent(const char* src, media::base::PLAYER_PRELOAD_EVENT event) = 0; + + /** + * @brief Occurs when one playback of the media file is completed. + */ + virtual void onCompleted() = 0; + + /** + * @brief The Agora CDN token has expired and needs to be renewed with renewAgoraCDNSrcToken. + */ + virtual void onAgoraCDNTokenWillExpire() = 0; + + /** + * @brief Reports current playback source info changed. + * + * @param from Streaming media information before the change. + * @param to Streaming media information after the change. + */ + virtual void onPlayerSrcInfoChanged(const media::base::SrcInfo& from, const media::base::SrcInfo& to) = 0; + + /** + * @brief Triggered when the media player information is updated. + * + * @param info The information of the media player. + */ + virtual void onPlayerInfoUpdated(const media::base::PlayerUpdatedInfo& info) = 0; + + /** + * @brief Triggered every second; reports the statistics of the files being cached. + * + * @param stats Cached file statistics.
+ */ + virtual void onPlayerCacheStats(const media::base::CacheStatistics& stats) { + (void)stats; + } + + /** + * @brief Triggered every second; reports the statistics of the media stream being played. + * + * @param stats The statistics of the media stream. + */ + virtual void onPlayerPlaybackStats(const media::base::PlayerPlaybackStats& stats) { + (void)stats; + } + + /** + * @brief Triggered every 200 milliseconds; reports the current player volume, in the range [0, 255]. + * + * @param volume The volume of the current player. + */ + virtual void onAudioVolumeIndication(int volume) = 0; +}; + +} //namespace rtc +} // namespace agora diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaRecorder.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaRecorder.h new file mode 100644 index 000000000..17375607c --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaRecorder.h @@ -0,0 +1,90 @@ +// +// Agora SDK +// +// Copyright (c) 2022 Agora.io. All rights reserved. +// +#pragma once // NOLINT(build/header_guard) + +#include "AgoraBase.h" +#include "AgoraMediaBase.h" +#include "IAgoraRtcEngineEx.h" + +namespace agora { +namespace rtc { + +class IMediaRecorder : public RefCountInterface { + protected: + virtual ~IMediaRecorder() {} + + public: + /** + * Registers the IMediaRecorderObserver object. + * + * @since v4.0.0 + * + * @note Call this method before the startRecording method. + * + * @param callback The callbacks for recording audio and video streams. See \ref IMediaRecorderObserver. + * + * @return + * - 0(ERR_OK): Success. + * - < 0: Failure: + */ + virtual int setMediaRecorderObserver(media::IMediaRecorderObserver* callback) = 0; + /** + * Starts recording the local or remote audio and video.
+ * + * @since v4.0.0 + * + * After successfully calling \ref IRtcEngine::createMediaRecorder "createMediaRecorder" to get the media recorder object + * , you can call this method to enable the recording of the local audio and video. + * + * This method can record the following content: + * - The audio captured by the local microphone and encoded in AAC format. + * - The video captured by the local camera and encoded by the SDK. + * - The audio received from remote users and encoded in AAC format. + * - The video received from remote users. + * + * The SDK can generate a recording file only when it detects the recordable audio and video streams; when there are + * no audio and video streams to be recorded or the audio and video streams are interrupted for more than five + * seconds, the SDK stops recording and triggers the + * \ref IMediaRecorderObserver::onRecorderStateChanged "onRecorderStateChanged" (RECORDER_STATE_ERROR, RECORDER_ERROR_NO_STREAM) + * callback. + * + * @note Call this method after joining the channel. + * + * @param config The recording configurations. See MediaRecorderConfiguration. + * + * @return + * - 0(ERR_OK): Success. + * - < 0: Failure: + * - `-1(ERR_FAILED)`: IRtcEngine does not support the request because the remote user did not subscribe to the target channel or the media streams published by the local user during remote recording. + * - `-2(ERR_INVALID_ARGUMENT)`: The parameter is invalid. Ensure the following: + * - The specified path of the recording file exists and is writable. + * - The specified format of the recording file is supported. + * - The maximum recording duration is correctly set. + * - During remote recording, ensure the user whose media streams you want to record has joined the channel. + * - `-4(ERR_NOT_SUPPORTED)`: IRtcEngine does not support the request due to one of the following reasons: + * - The recording is ongoing. + * - The recording stops because an error occurs.
+ * - No \ref IMediaRecorderObserver object is registered. + */ + virtual int startRecording(const media::MediaRecorderConfiguration& config) = 0; + /** + * Stops recording the audio and video. + * + * @since v4.0.0 + * + * @note After calling \ref IMediaRecorder::startRecording "startRecording", if you want to stop the recording, + * you must call `stopRecording`; otherwise, the generated recording files might not be playable. + * + * + * @return + * - 0(ERR_OK): Success. + * - < 0: Failure: + */ + virtual int stopRecording() = 0; +}; + +} //namespace rtc +} // namespace agora diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaStreamingSource.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaStreamingSource.h new file mode 100644 index 000000000..e1267b683 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMediaStreamingSource.h @@ -0,0 +1,332 @@ +// +// Agora SDK +// Copyright (c) 2019 Agora.io. All rights reserved. +// +// Created by xiaohua.lu in 2020-03. 
+// CodeStyle: Google C++ +// + +#pragma once // NOLINT(build/header_guard) + + +#include "AgoraBase.h" +#include "AgoraMediaBase.h" +#include "AgoraMediaPlayerTypes.h" +#include "AgoraRefPtr.h" + +namespace agora { +namespace rtc { + + +class IMediaStreamingSourceObserver; + + +/** + * @brief The error code of streaming source + * + */ +enum STREAMING_SRC_ERR { + STREAMING_SRC_ERR_NONE = 0, ///< no error + STREAMING_SRC_ERR_UNKNOWN = 1, ///< unknown error + STREAMING_SRC_ERR_INVALID_PARAM = 2, ///< invalid parameter + STREAMING_SRC_ERR_BAD_STATE = 3, ///< bad status + STREAMING_SRC_ERR_NO_MEM = 4, ///< not enough memory + STREAMING_SRC_ERR_BUFFER_OVERFLOW = 5, ///< buffer overflow + STREAMING_SRC_ERR_BUFFER_UNDERFLOW = 6, ///< buffer underflow + STREAMING_SRC_ERR_NOT_FOUND = 7, ///< not found + STREAMING_SRC_ERR_TIMEOUT = 8, ///< timeout + STREAMING_SRC_ERR_EXPIRED = 9, ///< expired + STREAMING_SRC_ERR_UNSUPPORTED = 10, ///< unsupported + STREAMING_SRC_ERR_NOT_EXIST = 11, ///< component not exist + STREAMING_SRC_ERR_EXIST = 12, ///< component already exist + STREAMING_SRC_ERR_OPEN = 13, ///< fail to IO open + STREAMING_SRC_ERR_CLOSE = 14, ///< fail to IO close + STREAMING_SRC_ERR_READ = 15, ///< fail to IO read + STREAMING_SRC_ERR_WRITE = 16, ///< fail to IO write + STREAMING_SRC_ERR_SEEK = 17, ///< fail to IO seek + STREAMING_SRC_ERR_EOF = 18, ///< reach to IO EOF, can do nothing + STREAMING_SRC_ERR_CODECOPEN = 19, ///< fail to codec open + STREAMING_SRC_ERR_CODECCLOSE = 20, ///< fail to codec close + STREAMING_SRC_ERR_CODECPROC = 21, ///< fail to codec process +}; + + + +/** + * @brief The state machine of Streaming Source + * + */ +enum STREAMING_SRC_STATE { + STREAMING_SRC_STATE_CLOSED = 0, ///< streaming source still closed, can do nothing + STREAMING_SRC_STATE_OPENING = 1, ///< after call open() method and start parsing streaming source + STREAMING_SRC_STATE_IDLE = 2, ///< streaming source is ready waiting for play + 
 STREAMING_SRC_STATE_PLAYING = 3, ///< after call play() method, playing & pushing the AV data + STREAMING_SRC_STATE_SEEKING = 4, ///< after call seek() method, start seeking position + STREAMING_SRC_STATE_EOF = 5, ///< the position is located at the end; playback can NOT continue + STREAMING_SRC_STATE_ERROR = 6, ///< the error status; can do nothing except close +}; + + +/** + * @brief The input SEI data + * + */ +struct InputSeiData { + int32_t type; ///< SEI type + int64_t timestamp; ///< the timestamp of the frame to attach to + int64_t frame_index; ///< the index of the frame to attach to + uint8_t* private_data; ///< the actual SEI data + int32_t data_size; ///< the size of the actual data +}; + + + +/** + * @brief The IMediaStreamingSource class provides access to a media streaming source demuxer. + * To playout multiple stream sources simultaneously, + * create multiple media stream source objects. + */ +class IMediaStreamingSource : public RefCountInterface { +public: + virtual ~IMediaStreamingSource() {}; + + + /** + * @brief Opens a media streaming source with a specified URL. + * @param url The path of the media file. Both the local path and online path are supported. + * @param start_pos The starting position (ms) for pushing. Default value is 0. + * @param auto_play Whether to start playing after the source is opened. + * @return + * - 0: Success. + * - < 0: Failure + */ + virtual int open(const char* url, int64_t start_pos, bool auto_play = true) = 0; + + /** + * @brief Close current media streaming source + * @return + * - 0: Success. + * - < 0: Failure + */ + virtual int close() = 0; + + /** + * @brief Gets the unique source ID of the streaming source. + * @return + * - >= 0: The source ID of this streaming source. + * - < 0: Failure.
+ */ + virtual int getSourceId() const = 0; + + /** + * @brief Retrieve whether the video stream is valid + * @return Whether the video stream is valid + */ + virtual bool isVideoValid() = 0; + + /** + * @brief Retrieve whether the audio stream is valid + * @return Whether the audio stream is valid + */ + virtual bool isAudioValid() = 0; + + /** + * @brief Gets the duration of the streaming source. + * @param [out] duration A reference to the duration of the media file. + * @return + * - 0: Success. + * - < 0: Failure. See {@link STREAMING_SRC_ERR}. + */ + virtual int getDuration(int64_t& duration) = 0; + + /** + * @brief Gets the number of streams in the streaming source + * @param [out] count The number of the media streams in the media source. + * @return + * - 0: Success. + * - < 0: Failure. See {@link STREAMING_SRC_ERR}. + */ + virtual int getStreamCount(int64_t& count) = 0; + + /** + * @brief Gets the detailed information of a media stream. + * @param index The index of the media stream. + * @param [out] out_info The detailed information of the media stream. See \ref media::base::PlayerStreamInfo "PlayerStreamInfo" for details. + * @return + * - 0: Success. + * - < 0: Failure. See {@link STREAMING_SRC_ERR}. + */ + virtual int getStreamInfo(int64_t index, media::base::PlayerStreamInfo* out_info) = 0; + + /** + * @brief Sets whether to loop the streaming source for playback. + * @param loop_count The number of times of looping the media file. + * - 1: Play the media file once. + * - 2: Play the media file twice. + * - <= 0: Play the media file in a loop indefinitely, until {@link stop} is called. + * @return + * - 0: Success. + * - < 0: Failure. See {@link STREAMING_SRC_ERR}. + */ + virtual int setLoopCount(int64_t loop_count) = 0; + + /** + * @brief Play & push the streaming source. + * @return + * - 0: Success. + * - < 0: Failure. See {@link STREAMING_SRC_ERR}. + */ + virtual int play() = 0; + + /** + * @brief Pauses the playing & pushing of the streaming source, keeping the current position. + * @return + * - 0: Success.
+ * - < 0: Failure. See {@link STREAMING_SRC_ERR}. + */ + virtual int pause() = 0; + + /** + * @brief Stop the playing & pushing of the streaming source, setting the position to 0. + * @return + * - 0: Success. + * - < 0: Failure. See {@link STREAMING_SRC_ERR}. + */ + virtual int stop() = 0; + + /** + * @brief Sets the playback position of the streaming source. + * After seeking is done, it returns to the previous status. + * @param new_pos The new playback position (ms). + * @return + * - 0: Success. + * - < 0: Failure. See {@link STREAMING_SRC_ERR}. + */ + virtual int seek(int64_t new_pos) = 0; + + /** + * @brief Gets the current playback position of the media file. + * @param [out] pos A reference to the current playback position (ms). + * @return + * - 0: Success. + * - < 0: Failure. See {@link STREAMING_SRC_ERR}. + */ + virtual int getCurrPosition(int64_t& pos) = 0; + + /** + * @brief Gets the status of current streaming source. + * @return The current state machine + */ + virtual STREAMING_SRC_STATE getCurrState() = 0; + + /** + * @brief Append the SEI data which can be sent attached to a video packet + * @param inSeiData The input SEI data to append. + * @return + * - 0: Success. + * - < 0: Failure. See {@link STREAMING_SRC_ERR}. + */ + virtual int appendSeiData(const InputSeiData& inSeiData) = 0; + + /** + * Registers a streaming source observer. + * + * Once the observer is registered, you can use it to monitor the state change of the streaming source. + * @param observer The pointer to the IMediaStreamingSourceObserver object. + * @return + * - 0: Success. + * - < 0: Failure. See {@link STREAMING_SRC_ERR}. + */ + virtual int registerObserver(IMediaStreamingSourceObserver* observer) = 0; + + /** + * Releases the streaming source observer. + * @param observer The pointer to the IMediaStreamingSourceObserver object. + * @return + * - 0: Success.
+ * - < 0: Failure. See {@link STREAMING_SRC_ERR}. + */ + virtual int unregisterObserver(IMediaStreamingSourceObserver* observer) = 0; + + /** + * @brief Parse a media information with a specified URL. + * @param url The path of the media file. Both the local path and online path are supported. + * @param video_info The output video information; there is no video track if video_info.streamIndex is less than 0. + * @param audio_info The output audio information; there is no audio track if audio_info.streamIndex is less than 0. + * @return + * - 0: Success. + * - < 0: Failure + */ + virtual int parseMediaInfo(const char* url, + media::base::PlayerStreamInfo& video_info, + media::base::PlayerStreamInfo& audio_info) = 0; + +}; + + + +/** + * @brief The observer interface of the media streaming source + */ +class IMediaStreamingSourceObserver { + public: + virtual ~IMediaStreamingSourceObserver() {}; + + + /** + * @brief Reports the playback state change. + * When the state of the playback changes, + * the SDK triggers this callback to report the new playback state + * and the reason or error for the change. + * @param state The new playback state after change. See {@link STREAMING_SRC_STATE}. + * @param err_code The player's error code. See {@link STREAMING_SRC_ERR}. + */ + virtual void onStateChanged(STREAMING_SRC_STATE state, STREAMING_SRC_ERR err_code) = 0; + + /** + * @brief Triggered when the file is opened + * @param err_code The error code + * @return None + */ + virtual void onOpenDone(STREAMING_SRC_ERR err_code) = 0; + + /** + * @brief Triggered when seeking is done + * @param err_code The error code + * @return None + */ + virtual void onSeekDone(STREAMING_SRC_ERR err_code) = 0; + + /** + * @brief Triggered when playback reaches EOF + * @param progress_ms The progress position + * @param repeat_count The repeated count of playing + */ + virtual void onEofOnce(int64_t progress_ms, int64_t repeat_count) = 0; + + /** + * @brief Reports current playback progress.
+ * This callback is triggered once every second during the playing status. + * @param position_ms Current playback progress (millisecond). + */ + virtual void onProgress(int64_t position_ms) = 0; + + /** + * @brief Occurs when the metadata is received. + * The callback occurs when the player receives the media metadata + * and reports the detailed information of the media metadata. + * @param data The detailed data of the media metadata. + * @param length The data length (bytes). + */ + virtual void onMetaData(const void* data, int length) = 0; + +}; + + + +} //namespace rtc +} // namespace agora + diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMusicContentCenter.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMusicContentCenter.h new file mode 100644 index 000000000..d5ed99ef8 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraMusicContentCenter.h @@ -0,0 +1,538 @@ +// +// Agora Media SDK +// +// Created by FanYuanYuan in 2022-05. +// Copyright (c) 2022 Agora IO. All rights reserved. +// + +#pragma once + +#include "AgoraRefPtr.h" +#include "IAgoraMediaPlayer.h" + +namespace agora { +namespace rtc { + +typedef enum +{ + /** + * 0: No error occurs and preload succeeds. + */ + kPreloadStatusCompleted = 0, + + /** + * 1: A general error occurs. + */ + kPreloadStatusFailed = 1, + + /** + * 2: The media file is preloading. + */ + kPreloadStatusPreloading = 2, + /** + * 3: The media file is removed. + */ + kPreloadStatusRemoved = 3, +} PreloadStatusCode; + +typedef enum +{ + /** + * 0: No error occurs and request succeeds. + */ + kMusicContentCenterStatusOk = 0, + /** + * 1: A general error occurs. + */ + kMusicContentCenterStatusError = 1, + /** + * 2: The gateway error. There are several possible reasons: + * - Token is expired. Check if your token is expired. + * - Token is invalid. Check the type of token you passed in. + * - Network error.
Check your network. + */ + kMusicContentCenterStatusGateway = 2, + /** + * 3: Permission and resource error. There are several possible reasons: + * - Your appid may not have the mcc permission. Please contact technical support + * - The resource may not exist. Please contact technical support + */ + kMusicContentCenterStatusPermissionAndResource = 3, + /** + * 4: Internal data parse error. Please contact technical support + */ + kMusicContentCenterStatusInternalDataParse = 4, + /** + * 5: Music loading error. Please contact technical support + */ + kMusicContentCenterStatusMusicLoading = 5, + /** + * 6: Music decryption error. Please contact technical support + */ + kMusicContentCenterStatusMusicDecryption = 6, + /** + * 7: Http internal error. Please retry later. + */ + kMusicContentCenterStatusHttpInternalError = 7, +} MusicContentCenterStatusCode; + +typedef struct +{ + /** + * Name of the music chart + */ + const char* chartName; + /** + * Id of the music chart, which is used to get music list + */ + int32_t id; +} MusicChartInfo; + +enum MUSIC_CACHE_STATUS_TYPE { + /** + * 0: Music is already cached. + */ + MUSIC_CACHE_STATUS_TYPE_CACHED = 0, + /** + * 1: Music is being cached. + */ + MUSIC_CACHE_STATUS_TYPE_CACHING = 1 +}; + +struct MusicCacheInfo { + /** + * The songCode of music. + */ + int64_t songCode; + /** + * The cache status of the music. 
+ */ + MUSIC_CACHE_STATUS_TYPE status; + MusicCacheInfo():songCode(0), status(MUSIC_CACHE_STATUS_TYPE_CACHED) {} +}; + +class MusicChartCollection : public RefCountInterface { +public: + virtual int getCount() = 0; + virtual MusicChartInfo* get(int index) = 0; +protected: + virtual ~MusicChartCollection() = default; +}; + +struct MvProperty +{ + /** + * The resolution of the mv + */ + const char* resolution; + /** + * The bandwidth of the mv + */ + const char* bandwidth; +}; + +struct ClimaxSegment +{ + /** + * The start time of climax segment + */ + int32_t startTimeMs; + /** + * The end time of climax segment + */ + int32_t endTimeMs; +}; + +struct Music +{ + /** + * The songCode of music + */ + int64_t songCode; + /** + * The name of music + */ + const char* name; + /** + * The singer of music + */ + const char* singer; + /** + * The poster url of music + */ + const char* poster; + /** + * The release time of music + */ + const char* releaseTime; + /** + * The duration (in seconds) of music + */ + int32_t durationS; + /** + * The type of music + * 1, mp3 with instrumental accompaniment and original + * 2, mp3 only with instrumental accompaniment + * 3, mp3 only with original + * 4, mp4 with instrumental accompaniment and original + * 5, mv only + * 6, new type mp4 with instrumental accompaniment and original + * detail at document of music media center + */ + int32_t type; + /** + * The pitch type of music. 
+ * 1, xml lyric has pitch + * 2, lyric has no pitch + */ + int32_t pitchType; + /** + * The number of lyrics available for the music + */ + int32_t lyricCount; + /** + * The lyric list of music + * 0, xml + * 1, lrc + */ + int32_t* lyricList; + /** + * The number of climax segments of the music + */ + int32_t climaxSegmentCount; + /** + * The climax segment list of music + */ + ClimaxSegment* climaxSegmentList; + /** + * The number of MVs of the music + * If this value is greater than zero, the current music has MV resources + */ + int32_t mvPropertyCount; + /** + * The MV property list of the music + */ + MvProperty* mvPropertyList; +}; + +class MusicCollection : public RefCountInterface { +public: + virtual int getCount() = 0; + virtual int getTotal() = 0; + virtual int getPage() = 0; + virtual int getPageSize() = 0; + virtual Music* getMusic(int32_t index) = 0; +protected: + virtual ~MusicCollection() = default; +}; + + +class IMusicContentCenterEventHandler { +public: + /** + * The music chart result callback; occurs when the getMusicCharts method is called. + * + * @param requestId The request id is the same as that returned by getMusicCharts. + * @param result The result of the music chart collection + * @param status The status of the request. See MusicContentCenterStatusCode + */ + virtual void onMusicChartsResult(const char* requestId, agora_refptr<MusicChartCollection> result, MusicContentCenterStatusCode status) = 0; + + /** + * The music collection result callback; occurs when the getMusicCollectionByMusicChartId or searchMusic method is called. + * + * @param requestId The request id is the same as that returned by getMusicCollectionByMusicChartId or searchMusic + * @param result The result of the music collection + * @param status The status of the request.
See MusicContentCenterStatusCode + */ + virtual void onMusicCollectionResult(const char* requestId, agora_refptr<MusicCollection> result, MusicContentCenterStatusCode status) = 0; + + /** + * The lyric url callback; occurs when getLyric is called. + * + * @param requestId The request id is the same as that returned by getLyric + * @param songCode Song code + * @param lyricUrl The lyric url of this music + * @param status The status of the request. See MusicContentCenterStatusCode + */ + virtual void onLyricResult(const char* requestId, int64_t songCode, const char* lyricUrl, MusicContentCenterStatusCode status) = 0; + + /** + * The song simple info callback; occurs when getSongSimpleInfo is called. + * + * @param requestId The request id is the same as that returned by getSongSimpleInfo. + * @param songCode Song code + * @param simpleInfo The metadata of the music. + * @param status The status of the request. See MusicContentCenterStatusCode + */ + virtual void onSongSimpleInfoResult(const char* requestId, int64_t songCode, const char* simpleInfo, MusicContentCenterStatusCode status) = 0; + + /** + * The preload progress callback; occurs when preload is called. + * + * @param requestId The request id is the same as that returned by preload. + * @param songCode Song code + * @param percent Preload progress (0 ~ 100) + * @param lyricUrl The lyric url of this music + * @param preloadStatus Preload status; see PreloadStatusCode. + * @param mccStatus The status of the request.
See MusicContentCenterStatusCode + */ + virtual void onPreLoadEvent(const char* requestId, int64_t songCode, int percent, const char* lyricUrl, PreloadStatusCode preloadStatus, MusicContentCenterStatusCode mccStatus) = 0; + + virtual ~IMusicContentCenterEventHandler() {}; +}; + +struct MusicContentCenterConfiguration { + /** + * The app ID of the project that has enabled the music content center + */ + const char *appId; + /** + * The token that the music content center uses to connect to the server + */ + const char *token; + /** + * The user ID used by the music content center. It can be different from the user ID of the RTC product. + */ + int64_t mccUid; + /** + * The maximum number of media files that the music content center caches. The value cannot exceed 50. + */ + int32_t maxCacheSize; + /** + * @technical preview + */ + const char* mccDomain; + /** + * Event handler to get callback result. + */ + IMusicContentCenterEventHandler* eventHandler; + MusicContentCenterConfiguration():appId(nullptr),token(nullptr),eventHandler(nullptr),mccUid(0),maxCacheSize(10), mccDomain(nullptr){} + MusicContentCenterConfiguration(const char*appid,const char* token,int64_t id,IMusicContentCenterEventHandler* handler,int32_t maxSize = 10, const char* apiurl = nullptr): + appId(appid),token(token),mccUid(id),eventHandler(handler),maxCacheSize(maxSize), mccDomain(apiurl){} +}; + +class IMusicPlayer : public IMediaPlayer { +protected: + virtual ~IMusicPlayer() {}; + +public: + IMusicPlayer() {}; + using IMediaPlayer::open; + /** + * Opens a media file with specified parameters. + * + * @param songCode The identifier of the media file that you want to play. + * @param startPos The playback position (ms) of the music file. + * @return + * - 0: Success. + * - < 0: Failure.
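+ * + * A usage sketch (illustrative only; `musicContentCenter` stands for an initialized IMusicContentCenter instance, and `songCode` comes from a Music item reported in onMusicCollectionResult): + * + * agora_refptr<IMusicPlayer> musicPlayer = musicContentCenter->createMusicPlayer(); + * if (musicPlayer) { + * musicPlayer->open(songCode, 0); // start playback from the beginning + * }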
+ */ + virtual int open(int64_t songCode, int64_t startPos = 0) = 0; +}; + +class IMusicContentCenter +{ +protected: + virtual ~IMusicContentCenter(){}; +public: + IMusicContentCenter() {}; + + /** + * Initializes the IMusicContentCenter and sets the token and other parameters of the music content center. + * + * @param configuration The configuration of the music content center. See MusicContentCenterConfiguration. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int initialize(const MusicContentCenterConfiguration & configuration) = 0; + + /** + * Renews the token of the music content center. + * + * @param token The new token. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int renewToken(const char* token) = 0; + + /** + * Releases the music content center resources. + * + */ + virtual void release() = 0; + + /** + * Registers the event handler. + */ + virtual int registerEventHandler(IMusicContentCenterEventHandler* eventHandler) = 0; + + /** + * Unregisters the event handler. + */ + virtual int unregisterEventHandler() = 0; + + /** + * Creates a music player source object and returns its pointer. + * @return + * - The pointer to \ref rtc::IMusicPlayer "IMusicPlayer", + * if the method call succeeds. + * - The empty pointer NULL, if the method call fails. + */ + virtual agora_refptr<IMusicPlayer> createMusicPlayer() = 0; + + /** + * Gets the music chart collection. + * If the method call succeeds, get the result from the + * \ref agora::rtc::IMusicContentCenterEventHandler::onMusicChartsResult + * "onMusicChartsResult" callback + * @param requestId The request id of this query, in uuid format. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getMusicCharts(agora::util::AString& requestId) = 0; + + /** + * Gets the music collection of a music chart by musicChartId and page info. + * If the method call succeeds, get the result from the + * \ref agora::rtc::IMusicContentCenterEventHandler::onMusicCollectionResult + * "onMusicCollectionResult" callback + * @param requestId The request id of this query, in uuid format.
+ * @param musicChartId The music chart id obtained from getMusicCharts. + * @param page The page of the music chart, starting from 1. + * @param pageSize The page size, max is 50. + * @param jsonOption The extension parameter; the default value is null. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getMusicCollectionByMusicChartId(agora::util::AString& requestId, int32_t musicChartId, int32_t page, int32_t pageSize, const char* jsonOption = nullptr) = 0; + + /** + * Searches music by keyword and page info. + * If the method call succeeds, get the result from the + * \ref agora::rtc::IMusicContentCenterEventHandler::onMusicCollectionResult + * "onMusicCollectionResult" callback + * @param requestId The request id of this query, in uuid format. + * @param keyWord The key word to search. + * @param page The page of the music search result, starting from 1. + * @param pageSize The page size, max is 50. + * @param jsonOption The extension parameter; the default value is null. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int searchMusic(agora::util::AString& requestId, const char* keyWord, int32_t page, int32_t pageSize, const char* jsonOption = nullptr) = 0; + + /** + * Preloads a media file with specified parameters. + * + * @deprecated This method is deprecated. Use preload(agora::util::AString& requestId, int64_t songCode) instead. + * + * @param songCode The identifier of the media file that you want to play. + * @param jsonOption The extension parameter; the default value is null. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int preload(int64_t songCode, const char* jsonOption) __deprecated = 0; + + /** + * Preloads a media file with specified parameters. + * + * @param requestId The request id of this query, in uuid format. + * @param songCode The identifier of the media file that you want to play. + * @return + * - 0: Success. + * - < 0: Failure.
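+ * + * A usage sketch (illustrative only; `musicContentCenter` stands for an initialized IMusicContentCenter instance; match the returned requestId against the one reported in onPreLoadEvent): + * + * agora::util::AString requestId; + * int ret = musicContentCenter->preload(requestId, songCode); + * if (ret != 0) { + * // handle the failure + * }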
+ */ + virtual int preload(agora::util::AString& requestId, int64_t songCode) = 0; + + /** + * Removes a cached media file. + * + * @param songCode The identifier of the media file that you want to play. + * @return + * - 0: Success; the cached media file is removed. + * - < 0: Failure. + */ + virtual int removeCache(int64_t songCode) = 0; + + /** + * Gets the cached media files. + * Before calling this API, you should allocate a memory buffer that stores the cached media file information, pass the pointer of the buffer as the input parameter cacheInfo, and set the size of the memory buffer to cacheInfoSize. + * The sample code below illustrates how to request the cached media file information: + * + * int32_t cacheInfoSize = 10; // allocate room for up to 10 MusicCacheInfo entries + * agora::rtc::MusicCacheInfo *infos = new agora::rtc::MusicCacheInfo[cacheInfoSize]; + * int ret = musicContentCenter->getCaches(infos, &cacheInfoSize); + * if (ret < 0) { // an error occurred + * delete [] infos; + * return; + * } + * std::cout << "the cache size:" << cacheInfoSize << std::endl; // cacheInfoSize now holds the actual number of entries + * + * + * @param cacheInfo An output parameter; A pointer to the memory buffer that stores the cached media file information. The memory buffer pointed to by cacheInfo should be allocated by yourself before calling this API. + * @param cacheInfoSize + * - Input: The number of MusicCacheInfo entries that the buffer can hold. + * - Output: The actual number of MusicCacheInfo entries returned. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getCaches(MusicCacheInfo *cacheInfo, int32_t* cacheInfoSize) = 0; + + /** + * Checks whether the media file is preloaded. + * + * @param songCode The identifier of the media file that you want to play. + * @return + * - 0: Success; the file is preloaded. + * - < 0: Failure. + */ + virtual int isPreloaded(int64_t songCode) = 0; + + /** + * Gets the lyric of the music. + * + * @param requestId The request id of this query, in uuid format.
+ * @param songCode The identifier of the media file that you want to play. + * @param LyricType The type of the lyric file. 0: xml; 1: lrc. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getLyric(agora::util::AString& requestId, int64_t songCode, int32_t LyricType = 0) = 0; + + /** + * Gets the metadata of a specific music. Once this method is called, the SDK triggers the onSongSimpleInfoResult callback to report the metadata of the music. + * + * @param requestId The request id of this query, in uuid format. + * @param songCode The identifier of the media file. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getSongSimpleInfo(agora::util::AString& requestId, int64_t songCode) = 0; + + /** + * Gets the internal song code from songCode and jsonOption. + * + * @param songCode The identifier of the media file. + * @param jsonOption An extension parameter. The default value is null. It is a json-format string whose `key` and `value` can be customized according to your scenarios. + * @param internalSongCode An output parameter; the internal song code. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getInternalSongCode(int64_t songCode, const char* jsonOption, int64_t& internalSongCode) = 0; + +}; + +} // namespace rtc +} // namespace agora diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraParameter.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraParameter.h new file mode 100644 index 000000000..b88969e1d --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraParameter.h @@ -0,0 +1,310 @@ +// +// Agora Engine SDK +// +// Created by minbo in 2019-10. +// Copyright (c) 2019 Agora.io. All rights reserved. + +/* + * Copyright (c) 2012 The WebRTC project authors. All Rights Reserved.
+ * + * Use of this source code is governed by a BSD-style license + * that can be found in the LICENSE file in the root of the source + * tree. An additional intellectual property rights grant can be found + * in the file PATENTS. All contributing project authors may + * be found in the AUTHORS file in the root of the source tree. + */ + +#pragma once // NOLINT(build/header_guard) +#include "AgoraRefPtr.h" + +// external key +/** + * set the range of ports available for connection + * @example "{\"rtc.udp_port_range\":[4500, 5000]}" + */ +#define KEY_RTC_UDP_PORT_RANGE "rtc.udp_port_range" +/** + * set the list of ports available for connection + * @example "{\"rtc.udp_port_list\":[4501, 4502, 4503, 4504, 4505, 4506]}" + */ +#define KEY_RTC_UDP_PORT_LIST "rtc.udp_port_list" + +/** + * get the fd of sending socket available for connection + * note: set method is not supported. + */ +#define KEY_RTC_UDP_SEND_FD "rtc.udp_send_fd" + + /** + * set the video encoder mode (hardware or software) + */ +#define KEY_RTC_VIDEO_ENABLED_HW_ENCODER "engine.video.enable_hw_encoder" +#define KEY_RTC_VIDEO_HARDWARE_ENCODEING "che.hardware_encoding" +#define KEY_RTC_VIDEO_H264_HWENC "che.video.h264.hwenc" + /** + * set the hardware video encoder provider (nv for nvidia or qsv for intel) + */ +#define KEY_RTC_VIDEO_HW_ENCODER_PROVIDER "engine.video.hw_encoder_provider" + + /** + * set the video decoder mode (hardware or software) + */ +#define KEY_RTC_VIDEO_ENABLED_HW_DECODER "engine.video.enable_hw_decoder" +#define KEY_RTC_VIDEO_HARDWARE_DECODING "che.hardware_decoding" + + /** + * set the hardware video decoder provider (h264_cuvid(default) or h264_qsv) + */ +#define KEY_RTC_VIDEO_HW_DECODER_PROVIDER "engine.video.hw_decoder_provider" + + /** + * override the lua policy + */ +#define KEY_RTC_VIDEO_OVERRIDE_SMALLVIDEO_NOT_USE_HWENC_POLICY "engine.video.override_smallvideo_not_use_hwenc_policy" + +/** + * enable/disable video packet retransmission, enabled by default + */ +#define 
KEY_RTC_VIDEO_RESEND "rtc.video_resend" + +/** + * enable/disable audio packet retransmission, enabled by default +*/ +#define KEY_RTC_AUDIO_RESEND "rtc.audio_resend" + +/** + * set the bitrate ratio for video +*/ +#define KEY_RTC_VIDEO_BITRATE_ADJUST_RATIO "rtc.video.bitrate_adjust_ratio" + +/** + * set the minbitrate / bitrate ratio for video +*/ +#define KEY_RTC_VIDEO_MINBITRATE_RATIO "rtc.video.minbitrate_ratio" + +/** + * set the degradation preference +*/ +#define KEY_RTC_VIDEO_DEGRADATION_PREFERENCE "rtc.video.degradation_preference" + +/** + * set the degradation fps down step +*/ + +#define KEY_RTC_VIDEO_DEGRADATION_FPS_DOWN_STEP "rtc.video.degradation_fps_down_step" +/** + * set the degradation fps up step +*/ +#define KEY_RTC_VIDEO_DEGRADATION_FPS_UP_STEP "rtc.video.degradation_fps_up_step" + +/** + * set the duration ms for connection lost callback +*/ +#define KEY_RTC_CONNECTION_LOST_PERIOD "rtc.connection_lost_period" + +/** + * set the local ip +*/ +#define KEY_RTC_LOCAL_IP "rtc.local.ip" + +/** + * set the network interface +*/ +#define KEY_RTC_NETWORK_INTERFACE "rtc.network.interface" + +/** + * set the video codec type, such as "H264", "JPEG" + */ +#define KEY_RTC_VIDEO_CODEC_TYPE "engine.video.codec_type" +#define KEY_RTC_VIDEO_MINOR_STREAM_CODEC_TYPE "engine.video.minor_stream_codec_type" +#define KEY_RTC_VIDEO_CODEC_INDEX "che.video.videoCodecIndex" +/** + * only use average QP for quality scaling +*/ +#define KEY_RTC_VIDEO_QUALITY_SCALE_ONLY_ON_AVERAGE_QP "engine.video.quality_scale_only_on_average_qp" + +/** + * low bound for quality scaling +*/ +#define KEY_RTC_VIDEO_H264_QP_THRESHOLD_LOW "engine.video.h264_qp_thresholds_low" + +/** + * high bound for quality scaling +*/ +#define KEY_RTC_VIDEO_H264_QP_THRESHOLD_HIGH "engine.video.h264_qp_thresholds_high" + + +namespace agora { + +namespace util { +template <class T> +class CopyableAutoPtr; + +class IString; +typedef CopyableAutoPtr<IString> AString; +} // namespace util + +namespace base { + +class
IAgoraParameter : public RefCountInterface { + public: + /** + * release the resource + */ + virtual void release() = 0; + + /** + * set bool value of the json + * @param [in] key + * the key name + * @param [in] value + * the value + * @return return 0 if success or an error code + */ + virtual int setBool(const char* key, bool value) = 0; + + /** + * set int value of the json + * @param [in] key + * the key name + * @param [in] value + * the value + * @return return 0 if success or an error code + */ + virtual int setInt(const char* key, int value) = 0; + + /** + * set unsigned int value of the json + * @param [in] key + * the key name + * @param [in] value + * the value + * @return return 0 if success or an error code + */ + virtual int setUInt(const char* key, unsigned int value) = 0; + + /** + * set double value of the json + * @param [in] key + * the key name + * @param [in] value + * the value + * @return return 0 if success or an error code + */ + virtual int setNumber(const char* key, double value) = 0; + + /** + * set string value of the json + * @param [in] key + * the key name + * @param [in] value + * the value + * @return return 0 if success or an error code + */ + virtual int setString(const char* key, const char* value) = 0; + + /** + * set object value of the json + * @param [in] key + * the key name + * @param [in] value + * the value + * @return return 0 if success or an error code + */ + virtual int setObject(const char* key, const char* value) = 0; + + /** + * set array value of the json + * @param [in] key + * the key name + * @param [in] value + * the value + * @return return 0 if success or an error code + */ + virtual int setArray(const char* key, const char* value) = 0; + /** + * get bool value of the json + * @param [in] key + * the key name + * @param [in, out] value + * the value + * @return return 0 if success or an error code + */ + virtual int getBool(const char* key, bool& value) = 0; + + /** + * get int value of the json + * @param 
[in] key + * the key name + * @param [in, out] value + * the value + * @return return 0 if success or an error code + */ + virtual int getInt(const char* key, int& value) = 0; + + /** + * get unsigned int value of the json + * @param [in] key + * the key name + * @param [in, out] value + * the value + * @return return 0 if success or an error code + */ + virtual int getUInt(const char* key, unsigned int& value) = 0; + + /** + * get double value of the json + * @param [in] key + * the key name + * @param [in, out] value + * the value + * @return return 0 if success or an error code + */ + virtual int getNumber(const char* key, double& value) = 0; + + /** + * get string value of the json + * @param [in] key + * the key name + * @param [in, out] value + * the value + * @return return 0 if success or an error code + */ + virtual int getString(const char* key, agora::util::AString& value) = 0; + + /** + * get a child object value of the json + * @param [in] key + * the key name + * @param [in, out] value + * the value + * @return return 0 if success or an error code + */ + virtual int getObject(const char* key, agora::util::AString& value) = 0; + + /** + * get array value of the json + * @param [in] key + * the key name + * @param [in, out] value + * the value + * @return return 0 if success or an error code + */ + virtual int getArray(const char* key, const char* args, agora::util::AString& value) = 0; + + /** + * set parameters of the sdk or engine + * @param [in] parameters + * the parameters + * @return return 0 if success or an error code + */ + virtual int setParameters(const char* parameters) = 0; + + virtual int convertPath(const char* filePath, agora::util::AString& value) = 0; + + protected: + virtual ~IAgoraParameter() {} +}; + +} // namespace base +} // namespace agora diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraRhythmPlayer.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraRhythmPlayer.h 
new file mode 100644 index 000000000..e2e00ac70 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraRhythmPlayer.h @@ -0,0 +1,92 @@ +// +// Agora SDK +// +// Copyright (c) 2021 Agora.io. All rights reserved. +// + +#pragma once // NOLINT(build/header_guard) + +#include "AgoraBase.h" +#include "AgoraRefPtr.h" + +namespace agora { +namespace base { +class IAgoraService; +} + +namespace rtc { +class ILocalAudioTrack; +class IRtcEngineEventHandler; + +/** + The states of the rhythm player. + */ +enum RHYTHM_PLAYER_STATE_TYPE { + /** 810: The rhythm player is idle. */ + RHYTHM_PLAYER_STATE_IDLE = 810, + /** 811: The rhythm player is opening files. */ + RHYTHM_PLAYER_STATE_OPENING, + /** 812: Files opened successfully, the rhythm player starts decoding files. */ + RHYTHM_PLAYER_STATE_DECODING, + /** 813: Files decoded successfully, the rhythm player starts mixing the two files and playing them back locally. */ + RHYTHM_PLAYER_STATE_PLAYING, + /** 814: The rhythm player fails, and you need to check the reason code for the detailed failure reason. */ + RHYTHM_PLAYER_STATE_FAILED, +}; + +/** + The reason codes of the rhythm player. + */ +enum RHYTHM_PLAYER_REASON { + /** 0: The rhythm player works well. */ + RHYTHM_PLAYER_REASON_OK = 0, + /** 1: The rhythm player encounters an internal error. */ + RHYTHM_PLAYER_REASON_FAILED = 1, + /** 801: The rhythm player cannot open the file. */ + RHYTHM_PLAYER_REASON_CAN_NOT_OPEN = 801, + /** 802: The rhythm player cannot play the file. */ + RHYTHM_PLAYER_REASON_CAN_NOT_PLAY, + /** 803: The file duration exceeds the limit. The file duration limit is 1.2 seconds. */ + RHYTHM_PLAYER_REASON_FILE_OVER_DURATION_LIMIT, +}; + +/** + * The configuration of the rhythm player, + * which is set in startRhythmPlayer or configRhythmPlayer. + */ +struct AgoraRhythmPlayerConfig { + /** + * The number of beats per measure. The range is 1 to 9.
+ * The default value is 4, + * which means that each measure contains one downbeat and three upbeats. + */ + int beatsPerMeasure; + /** + * The number of beats per minute. The range is 60 to 360. + * The default value is 60, + * which means that the rhythm player plays 60 beats in one minute. + */ + int beatsPerMinute; + + AgoraRhythmPlayerConfig() : beatsPerMeasure(4), beatsPerMinute(60) {} +}; + +/** + * The IRhythmPlayer class provides access to a rhythm player entity. + * A rhythm player synthesizes beats, plays them locally, and pushes them remotely. + */ +class IRhythmPlayer : public RefCountInterface { +protected: + virtual ~IRhythmPlayer() {} + +public: + static agora_refptr<IRhythmPlayer> Create(); + virtual int initialize(base::IAgoraService* agora_service, IRtcEngineEventHandler* event_handler, bool is_pass_thru_mode) = 0; + virtual int playRhythm(const char* sound1, const char* sound2, const AgoraRhythmPlayerConfig& config) = 0; + virtual int stopRhythm() = 0; + virtual int configRhythmPlayer(const AgoraRhythmPlayerConfig& config) = 0; + virtual agora_refptr<ILocalAudioTrack> getRhythmPlayerTrack() = 0; +}; + +} //namespace rtc +} // namespace agora diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraRtcEngine.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraRtcEngine.h new file mode 100644 index 000000000..f7e8999b2 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraRtcEngine.h @@ -0,0 +1,8471 @@ +// +// Agora Rtc Engine SDK +// +// Copyright (c) 2018 Agora.io. All rights reserved.
+// +#pragma once + +#include "AgoraBase.h" +#include "AgoraMediaBase.h" +#include "IAgoraLog.h" +#include "AgoraOptional.h" +#include "IAudioDeviceManager.h" +#include "IAgoraRhythmPlayer.h" +#include "IAgoraMediaEngine.h" +#include "IAgoraH265Transcoder.h" + +namespace agora { +namespace rtm { +class IStreamChannel; +} +namespace rtc { + +template <typename T> +static void SetFrom(Optional<T>* s, const Optional<T>& o) { + if (o) { + *s = o; + } +} + +template <typename T> +static void ReplaceBy(Optional<T>* s, const Optional<T>& o) { + *s = o; +} + +//class IAudioDeviceManager; + +/** + * The media device types. + */ +enum MEDIA_DEVICE_TYPE { + /** + * -1: Unknown device type. + */ + UNKNOWN_AUDIO_DEVICE = -1, + /** + * 0: The audio playback device. + */ + AUDIO_PLAYOUT_DEVICE = 0, + /** + * 1: The audio recording device. + */ + AUDIO_RECORDING_DEVICE = 1, + /** + * 2: The video renderer. + */ + VIDEO_RENDER_DEVICE = 2, + /** + * 3: The video capturer. + */ + VIDEO_CAPTURE_DEVICE = 3, + /** + * 4: The audio playback device of the app. + */ + AUDIO_APPLICATION_PLAYOUT_DEVICE = 4, + /** + * 5: The virtual audio playback device. + */ + AUDIO_VIRTUAL_PLAYOUT_DEVICE = 5, + /** + * 6: The virtual audio recording device. + */ + AUDIO_VIRTUAL_RECORDING_DEVICE = 6, +}; + +/** + The playback state of the music file. + */ +enum AUDIO_MIXING_STATE_TYPE { + /** 710: The music file is playing. */ + AUDIO_MIXING_STATE_PLAYING = 710, + /** 711: The music file pauses playing. */ + AUDIO_MIXING_STATE_PAUSED = 711, + /** 713: The music file stops playing. */ + AUDIO_MIXING_STATE_STOPPED = 713, + /** 714: An error occurs during the playback of the audio mixing file. + */ + AUDIO_MIXING_STATE_FAILED = 714, +}; + +/** + The reason codes of the local user's audio mixing file. + */ +enum AUDIO_MIXING_REASON_TYPE { + /** 701: The SDK cannot open the audio mixing file. */ + AUDIO_MIXING_REASON_CAN_NOT_OPEN = 701, + /** 702: The SDK opens the audio mixing file too frequently.
*/ + AUDIO_MIXING_REASON_TOO_FREQUENT_CALL = 702, + /** 703: The audio mixing file playback is interrupted. */ + AUDIO_MIXING_REASON_INTERRUPTED_EOF = 703, + /** 721: The audio mixing file completes one loop of playback. */ + AUDIO_MIXING_REASON_ONE_LOOP_COMPLETED = 721, + /** 723: The audio mixing file completes all loops of playback. */ + AUDIO_MIXING_REASON_ALL_LOOPS_COMPLETED = 723, + /** 724: The audio mixing file is stopped by the user. */ + AUDIO_MIXING_REASON_STOPPED_BY_USER = 724, + /** 0: The SDK can open the audio mixing file. */ + AUDIO_MIXING_REASON_OK = 0, +}; + +/** + * The status of importing an external video stream in a live broadcast. + */ +enum INJECT_STREAM_STATUS { + /** + * 0: The media stream is injected successfully. + */ + INJECT_STREAM_STATUS_START_SUCCESS = 0, + /** + * 1: The media stream already exists. + */ + INJECT_STREAM_STATUS_START_ALREADY_EXISTS = 1, + /** + * 2: The media stream injection is unauthorized. + */ + INJECT_STREAM_STATUS_START_UNAUTHORIZED = 2, + /** + * 3: Timeout occurs when injecting a media stream. + */ + INJECT_STREAM_STATUS_START_TIMEDOUT = 3, + /** + * 4: The media stream injection fails. + */ + INJECT_STREAM_STATUS_START_FAILED = 4, + /** + * 5: The media stream stops being injected successfully. + */ + INJECT_STREAM_STATUS_STOP_SUCCESS = 5, + /** + * 6: The media stream injection that you want to stop is not found. + */ + INJECT_STREAM_STATUS_STOP_NOT_FOUND = 6, + /** + * 7: You are not authorized to stop the media stream injection. + */ + INJECT_STREAM_STATUS_STOP_UNAUTHORIZED = 7, + /** + * 8: Timeout occurs when you stop injecting the media stream. + */ + INJECT_STREAM_STATUS_STOP_TIMEDOUT = 8, + /** + * 9: Stopping injecting the media stream fails. + */ + INJECT_STREAM_STATUS_STOP_FAILED = 9, + /** + * 10: The media stream is broken. + */ + INJECT_STREAM_STATUS_BROKEN = 10, +}; + +/** + * The audio equalization band frequency. + */ +enum AUDIO_EQUALIZATION_BAND_FREQUENCY { + /** + * 0: 31 Hz.
+ */ + AUDIO_EQUALIZATION_BAND_31 = 0, + /** + * 1: 62 Hz. + */ + AUDIO_EQUALIZATION_BAND_62 = 1, + /** + * 2: 125 Hz. + */ + AUDIO_EQUALIZATION_BAND_125 = 2, + /** + * 3: 250 Hz. + */ + AUDIO_EQUALIZATION_BAND_250 = 3, + /** + * 4: 500 Hz. + */ + AUDIO_EQUALIZATION_BAND_500 = 4, + /** + * 5: 1 KHz. + */ + AUDIO_EQUALIZATION_BAND_1K = 5, + /** + * 6: 2 KHz. + */ + AUDIO_EQUALIZATION_BAND_2K = 6, + /** + * 7: 4 KHz. + */ + AUDIO_EQUALIZATION_BAND_4K = 7, + /** + * 8: 8 KHz. + */ + AUDIO_EQUALIZATION_BAND_8K = 8, + /** + * 9: 16 KHz. + */ + AUDIO_EQUALIZATION_BAND_16K = 9, +}; + +/** + * The audio reverberation type. + */ +enum AUDIO_REVERB_TYPE { + /** + * 0: (-20 to 10 dB), the level of the dry signal. + */ + AUDIO_REVERB_DRY_LEVEL = 0, + /** + * 1: (-20 to 10 dB), the level of the early reflection signal (wet signal). + */ + AUDIO_REVERB_WET_LEVEL = 1, + /** + * 2: (0 to 100 dB), the room size of the reflection. + */ + AUDIO_REVERB_ROOM_SIZE = 2, + /** + * 3: (0 to 200 ms), the length of the initial delay of the wet signal in ms. + */ + AUDIO_REVERB_WET_DELAY = 3, + /** + * 4: (0 to 100), the strength of the late reverberation. + */ + AUDIO_REVERB_STRENGTH = 4, +}; + +enum STREAM_FALLBACK_OPTIONS { + /** 0: No fallback operation for the stream when the network + condition is poor. The stream quality cannot be guaranteed. */ + + STREAM_FALLBACK_OPTION_DISABLED = 0, + /** 1: (Default) Under poor network conditions, the SDK will send or receive + agora::rtc::VIDEO_STREAM_LOW. You can only set this option in + RtcEngineParameters::setRemoteSubscribeFallbackOption. Nothing happens when + you set this in RtcEngineParameters::setLocalPublishFallbackOption. */ + STREAM_FALLBACK_OPTION_VIDEO_STREAM_LOW = 1, + /** 2: Under poor network conditions, the SDK may receive + agora::rtc::VIDEO_STREAM_LOW first, but if the network still does + not allow displaying the video, the SDK will send or receive audio only. 
*/ + STREAM_FALLBACK_OPTION_AUDIO_ONLY = 2, +}; + +enum PRIORITY_TYPE { + /** 50: High priority. + */ + PRIORITY_HIGH = 50, + /** 100: (Default) normal priority. + */ + PRIORITY_NORMAL = 100, +}; + +struct RtcConnection; + +/** Statistics of the local video stream. + */ +struct LocalVideoStats +{ + /** + * ID of the local user. + */ + uid_t uid; + /** The actual bitrate (Kbps) while sending the local video stream. + * @note This value does not include the bitrate for resending the video after packet loss. + */ + int sentBitrate; + /** The actual frame rate (fps) while sending the local video stream. + * @note This value does not include the frame rate for resending the video after packet loss. + */ + int sentFrameRate; + /** The capture frame rate (fps) of the local video. + */ + int captureFrameRate; + /** The width of the capture frame (px). + */ + int captureFrameWidth; + /** The height of the capture frame (px). + */ + int captureFrameHeight; + /** + * The regulated frame rate of capture frame rate according to video encoder configuration. + */ + int regulatedCaptureFrameRate; + /** + * The regulated frame width (pixel) of capture frame width according to video encoder configuration. + */ + int regulatedCaptureFrameWidth; + /** + * The regulated frame height (pixel) of capture frame height according to video encoder configuration. + */ + int regulatedCaptureFrameHeight; + /** The output frame rate (fps) of the local video encoder. + */ + int encoderOutputFrameRate; + /** The width of the encoding frame (px). + */ + int encodedFrameWidth; + /** The height of the encoding frame (px). + */ + int encodedFrameHeight; + /** The output frame rate (fps) of the local video renderer. + */ + int rendererOutputFrameRate; + /** The target bitrate (Kbps) of the current encoder. This is an estimate made by the SDK based on the current network conditions. + */ + int targetBitrate; + /** The target frame rate (fps) of the current encoder. 
+ */ + int targetFrameRate; + /** Quality adaption of the local video stream in the reported interval (based on the target frame + * rate and target bitrate). See #QUALITY_ADAPT_INDICATION. + */ + QUALITY_ADAPT_INDICATION qualityAdaptIndication; + /** The bitrate (Kbps) while encoding the local video stream. + * @note This value does not include the bitrate for resending the video after packet loss. + */ + int encodedBitrate; + /** The number of the sent video frames, represented by an aggregate value. + */ + int encodedFrameCount; + /** The codec type of the local video. See #VIDEO_CODEC_TYPE. + */ + VIDEO_CODEC_TYPE codecType; + /** + * The video packet loss rate (%) from the local client to the Agora server before applying the anti-packet loss strategies. + */ + unsigned short txPacketLossRate; + /** The brightness level of the video image captured by the local camera. See #CAPTURE_BRIGHTNESS_LEVEL_TYPE. + */ + CAPTURE_BRIGHTNESS_LEVEL_TYPE captureBrightnessLevel; + /** + * Whether we send dual stream now. + */ + bool dualStreamEnabled; + /** The hwEncoderAccelerating of the local video: + * - software = 0. + * - hardware = 1. + */ + int hwEncoderAccelerating; +}; + +/** + * Audio statistics of the remote user. + */ +struct RemoteAudioStats +{ + /** + * User ID of the remote user sending the audio stream. + */ + uid_t uid; + /** + * The quality of the remote audio: #QUALITY_TYPE. + */ + int quality; + /** + * The network delay (ms) from the sender to the receiver. + */ + int networkTransportDelay; + /** + * The network delay (ms) from the receiver to the jitter buffer. + * @note When the receiving end is an audience member and `audienceLatencyLevel` of `ClientRoleOptions` + * is 1, this parameter does not take effect. + */ + int jitterBufferDelay; + /** + * The audio frame loss rate in the reported interval. + */ + int audioLossRate; + /** + * The number of channels. 
+ */ + int numChannels; + /** + * The sample rate (Hz) of the remote audio stream in the reported interval. + */ + int receivedSampleRate; + /** + * The average bitrate (Kbps) of the remote audio stream in the reported + * interval. + */ + int receivedBitrate; + /** + * The total freeze time (ms) of the remote audio stream after the remote + * user joins the channel. + * + * In a session, audio freeze occurs when the audio frame loss rate reaches 4%. + */ + int totalFrozenTime; + /** + * The total audio freeze time as a percentage (%) of the total time when the + * audio is available. + */ + int frozenRate; + /** + * The quality of the remote audio stream as determined by the Agora + * real-time audio MOS (Mean Opinion Score) measurement method in the + * reported interval. The return value ranges from 0 to 500. Dividing the + * return value by 100 gets the MOS score, which ranges from 0 to 5. The + * higher the score, the better the audio quality. + * + * | MOS score | Perception of audio quality | + * |-----------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------| + * | Greater than 4 | Excellent. The audio sounds clear and smooth. | + * | From 3.5 to 4 | Good. The audio has some perceptible impairment, but still sounds clear. | + * | From 3 to 3.5 | Fair. The audio freezes occasionally and requires attentive listening. | + * | From 2.5 to 3 | Poor. The audio sounds choppy and requires considerable effort to understand. | + * | From 2 to 2.5 | Bad. The audio has occasional noise. Consecutive audio dropouts occur, resulting in some information loss. The users can communicate only with difficulty. | + * | Less than 2 | Very bad. The audio has persistent noise. Consecutive audio dropouts are frequent, resulting in severe information loss. Communication is nearly impossible. 
|
+   */
+  int mosValue;
+  /**
+   * If packet loss concealment (PLC) occurs for N consecutive times, it is counted as a freeze when PLC
+   * occurs for M consecutive times: freeze count = (n_plc - n) / m.
+   */
+  uint32_t frozenRateByCustomPlcCount;
+  /**
+   * The number of times audio packet loss concealment occurred.
+   */
+  uint32_t plcCount;
+
+  /**
+   * The total time (ms) when the remote user neither stops sending the audio
+   * stream nor disables the audio module after joining the channel.
+   */
+  int totalActiveTime;
+  /**
+   * The total publish duration (ms) of the remote audio stream.
+   */
+  int publishDuration;
+  /**
+   * Quality of experience (QoE) of the local user when receiving a remote audio stream. See #EXPERIENCE_QUALITY_TYPE.
+   */
+  int qoeQuality;
+  /**
+   * The reason for poor QoE of the local user when receiving a remote audio stream. See #EXPERIENCE_POOR_REASON.
+   */
+  int qualityChangedReason;
+  /**
+   * The total number of audio bytes received (bytes), including the FEC bytes, represented by an aggregate value.
+   */
+  unsigned int rxAudioBytes;
+
+  RemoteAudioStats()
+    : uid(0),
+      quality(0),
+      networkTransportDelay(0),
+      jitterBufferDelay(0),
+      audioLossRate(0),
+      numChannels(0),
+      receivedSampleRate(0),
+      receivedBitrate(0),
+      totalFrozenTime(0),
+      frozenRate(0),
+      mosValue(0),
+      frozenRateByCustomPlcCount(0),
+      plcCount(0),
+      totalActiveTime(0),
+      publishDuration(0),
+      qoeQuality(0),
+      qualityChangedReason(0),
+      rxAudioBytes(0) {}
+};
+
+/**
+ * The statistics of the remote video stream.
+ */
+struct RemoteVideoStats {
+  /**
+   * ID of the remote user sending the video stream.
+   */
+  uid_t uid;
+  /**
+   * @deprecated Time delay (ms).
+   *
+   * In scenarios where audio and video are synchronized, you can use the
+   * value of `networkTransportDelay` and `jitterBufferDelay` in `RemoteAudioStats`
+   * to know the delay statistics of the remote video.
+   */
+  int delay __deprecated;
+  /**
+   * End-to-end delay from video capturer to video renderer.
Hardware capture or render delay is excluded.
+   */
+  int e2eDelay;
+  /**
+   * The width (pixels) of the video stream.
+   */
+  int width;
+  /**
+   * The height (pixels) of the video stream.
+   */
+  int height;
+  /**
+   * Bitrate (Kbps) received since the last count.
+   */
+  int receivedBitrate;
+  /** The decoder output frame rate (fps) of the remote video.
+   */
+  int decoderOutputFrameRate;
+  /** The render output frame rate (fps) of the remote video.
+   */
+  int rendererOutputFrameRate;
+  /** The video frame loss rate (%) of the remote video stream in the reported interval.
+   */
+  int frameLossRate;
+  /** Packet loss rate (%) of the remote video stream after using the anti-packet-loss method.
+   */
+  int packetLossRate;
+  /**
+   * The type of the remote video stream: #VIDEO_STREAM_TYPE.
+   */
+  VIDEO_STREAM_TYPE rxStreamType;
+  /**
+   The total freeze time (ms) of the remote video stream after the remote user joins the channel.
+   In a video session where the frame rate is set to no less than 5 fps, video freeze occurs when
+   the time interval between two adjacent renderable video frames is more than 500 ms.
+   */
+  int totalFrozenTime;
+  /**
+   The total video freeze time as a percentage (%) of the total time when the video is available.
+   */
+  int frozenRate;
+  /**
+   The offset (ms) between the audio and video streams. A positive value indicates the audio leads the
+   video, and a negative value indicates the audio lags the video.
+   */
+  int avSyncTimeMs;
+  /**
+   * The total time (ms) when the remote user neither stops sending the video
+   * stream nor disables the video module after joining the channel.
+   */
+  int totalActiveTime;
+  /**
+   * The total publish duration (ms) of the remote video stream.
+   */
+  int publishDuration;
+  /**
+   * The quality of the remote video stream in the reported interval.
+   * The quality is determined by the Agora real-time video MOS (Mean Opinion Score) measurement method.
+   * The return value range is [0, 500].
+   * Dividing the return value by 100 gets the MOS score, which ranges from 0 to 5. The higher the score, the better the video quality.
+   * @note For textured video data, this parameter always returns 0.
+   */
+  int mosValue;
+  /**
+   * The total number of video bytes received (bytes), including the FEC bytes, represented by an aggregate value.
+   */
+  unsigned int rxVideoBytes;
+};
+
+struct VideoCompositingLayout {
+  struct Region {
+    /** User ID of the user whose video is to be displayed in the region.
+     */
+    uid_t uid;
+    /** Horizontal position of the region on the screen.
+     */
+    double x;  // [0,1]
+    /** Vertical position of the region on the screen.
+     */
+    double y;  // [0,1]
+    /**
+     Actual width of the region.
+     */
+    double width;  // [0,1]
+    /** Actual height of the region. */
+    double height;  // [0,1]
+    /** 0 means the region is at the bottom, and 100 means the region is at the
+     * top.
+     */
+    int zOrder;  // optional, [0, 100] // 0 (default): bottom most, 100: top most
+
+    /** 0 means the region is transparent, and 1 means the region is opaque. The
+     * default value is 1.0.
+     */
+    double alpha;
+
+    media::base::RENDER_MODE_TYPE renderMode;  // RENDER_MODE_HIDDEN: Crop, RENDER_MODE_FIT: Zoom to fit
+
+    Region()
+      : uid(0),
+        x(0),
+        y(0),
+        width(0),
+        height(0),
+        zOrder(0),
+        alpha(1.0),
+        renderMode(media::base::RENDER_MODE_HIDDEN) {}
+  };
+
+  /** Ignore this parameter. The width of the canvas is set by
+   agora::rtc::IRtcEngine::configPublisher, and not by
+   agora::rtc::VideoCompositingLayout::canvasWidth.
+   */
+  int canvasWidth;
+  /** Ignore this parameter. The height of the canvas is set by
+   agora::rtc::IRtcEngine::configPublisher, and not by
+   agora::rtc::VideoCompositingLayout::canvasHeight.
+   */
+  int canvasHeight;
+  /** The background color as a 6-digit RGB hex value, e.g. "#C0C0C0".
+   */
+  const char* backgroundColor;  // e.g. "#C0C0C0" in RGB
+  /** Region array. Each host in the channel can have a region to display the
+   * video on the screen.
+ */ + const Region* regions; + /** Number of users in the channel. + */ + int regionCount; + /** User-defined data. + */ + const char* appData; + /** Length of the user-defined data. + */ + int appDataLength; + + VideoCompositingLayout() + : canvasWidth(0), + canvasHeight(0), + backgroundColor(OPTIONAL_NULLPTR), + regions(NULL), + regionCount(0), + appData(OPTIONAL_NULLPTR), + appDataLength(0) {} +}; + +/** The definition of InjectStreamConfig. + */ +struct InjectStreamConfig { + /** Width of the stream to be added into the broadcast. The default value is + 0; same width as the original stream. + */ + int width; + /** Height of the stream to be added into the broadcast. The default value is + 0; same height as the original stream. + */ + int height; + /** Video GOP of the stream to be added into the broadcast. The default value + is 30. + */ + int videoGop; + /** Video frame rate of the stream to be added into the broadcast. The + default value is 15 fps. + */ + int videoFramerate; + /** Video bitrate of the stream to be added into the broadcast. The default + value is 400 Kbps. + */ + int videoBitrate; + /** Audio-sampling rate of the stream to be added into the broadcast: + #AUDIO_SAMPLE_RATE_TYPE. The default value is 48000. + */ + AUDIO_SAMPLE_RATE_TYPE audioSampleRate; + /** Audio bitrate of the stream to be added into the broadcast. The default + value is 48. + */ + int audioBitrate; + /** Audio channels to be added into the broadcast. The default value is 1. + */ + int audioChannels; + + // width / height default set to 0 means pull the stream with its original + // resolution + InjectStreamConfig() + : width(0), + height(0), + videoGop(30), + videoFramerate(15), + videoBitrate(400), + audioSampleRate(AUDIO_SAMPLE_RATE_48000), + audioBitrate(48), + audioChannels(1) {} +}; + +/** The video stream lifecycle of CDN Live. + */ +enum RTMP_STREAM_LIFE_CYCLE_TYPE { + /** Bound to the channel lifecycle. 
+   */
+  RTMP_STREAM_LIFE_CYCLE_BIND2CHANNEL = 1,
+  /** Bound to the owner identity of the RTMP stream.
+   */
+  RTMP_STREAM_LIFE_CYCLE_BIND2OWNER = 2,
+};
+
+/** The definition of PublisherConfiguration.
+*/
+struct PublisherConfiguration {
+  /** Width of the output data stream set for CDN Live. The default value is
+   640.
+   */
+  int width;
+  /** Height of the output data stream set for CDN Live. The default value is
+   360.
+   */
+  int height;
+  /** Frame rate of the output data stream set for CDN Live. The default value
+   is 15 fps.
+   */
+  int framerate;
+  /** Bitrate of the output data stream set for CDN Live. The default value is
+   500 Kbps.
+   */
+  int bitrate;
+  /** The default layout:
+   - 0: Tile horizontally
+   - 1: Layered windows
+   - 2: Tile vertically
+   */
+  int defaultLayout;
+  /** The video stream lifecycle of CDN Live: RTMP_STREAM_LIFE_CYCLE_TYPE
+   */
+  int lifecycle;
+  /** Whether the current user is the owner of the RTMP stream:
+   - True: Yes (default). The push-stream configuration takes effect.
+   - False: No. The push-stream configuration will not work.
+   */
+  bool owner;
+  /** Width of the stream to be injected. Set it as 0.
+   */
+  int injectStreamWidth;
+  /** Height of the stream to be injected. Set it as 0.
+   */
+  int injectStreamHeight;
+  /** URL address of the stream to be injected to the channel.
+   */
+  const char* injectStreamUrl;
+  /** Push-stream URL address for the picture-in-picture layouts. The default
+   value is NULL.
+   */
+  const char* publishUrl;
+  /** Push-stream URL address of the original stream which does not require
+   picture-blending. The default value is NULL.
+   */
+  const char* rawStreamUrl;
+  /** Reserved field. The default value is NULL.
+ */ + const char* extraInfo; + + PublisherConfiguration() + : width(640), + height(360), + framerate(15), + bitrate(500), + defaultLayout(1), + lifecycle(RTMP_STREAM_LIFE_CYCLE_BIND2CHANNEL), + owner(true), + injectStreamWidth(0), + injectStreamHeight(0), + injectStreamUrl(NULL), + publishUrl(NULL), + rawStreamUrl(NULL), + extraInfo(NULL) {} +}; + +/** + * The camera direction. + */ +enum CAMERA_DIRECTION { + /** The rear camera. */ + CAMERA_REAR = 0, + /** The front camera. */ + CAMERA_FRONT = 1, +}; + +/** The cloud proxy type. + * + * @since v3.3.0 + */ +enum CLOUD_PROXY_TYPE { + /** 0: Do not use the cloud proxy. + */ + NONE_PROXY = 0, + /** 1: The cloud proxy for the UDP protocol. + */ + UDP_PROXY = 1, + /// @cond + /** 2: The cloud proxy for the TCP (encrypted) protocol. + */ + TCP_PROXY = 2, + /// @endcond +}; + +/** Camera capturer configuration.*/ +struct CameraCapturerConfiguration { + /** Camera direction settings (for Android/iOS only). See: #CAMERA_DIRECTION. */ +#if defined(__ANDROID__) || (defined(__APPLE__) && TARGET_OS_IOS) + /** + * The camera direction. + */ + CAMERA_DIRECTION cameraDirection; +#else + /** For windows. The device ID of the playback device. The maximum length is #MAX_DEVICE_ID_LENGTH. */ + char deviceId[MAX_DEVICE_ID_LENGTH]; +#endif + /** The video format. See VideoFormat. */ + VideoFormat format; + bool followEncodeDimensionRatio; + CameraCapturerConfiguration() : followEncodeDimensionRatio(true) { +#if defined(__ANDROID__) || (defined(__APPLE__) && TARGET_OS_IOS) + cameraDirection = CAMERA_REAR; +#else + memset(deviceId, 0, sizeof(deviceId)); +#endif + } +}; +/** + * The configuration of the captured screen. + */ +struct ScreenCaptureConfiguration { + /** + * Whether to capture the window on the screen: + * - `true`: Capture the window. + * - `false`: (Default) Capture the screen, not the window. + */ + bool isCaptureWindow; // true - capture window, false - capture display + /** + * (macOS only) The display ID of the screen. 
+   */
+  uint32_t displayId;
+  /**
+   * (Windows only) The relative position of the shared screen to the virtual screen.
+   * @note This parameter takes effect only when you want to capture the screen on Windows.
+   */
+  Rectangle screenRect;  // Windows only
+  /**
+   * (For Windows and macOS only) The window ID.
+   * @note This parameter takes effect only when you want to capture the window.
+   */
+  view_t windowId;
+  /**
+   * (For Windows and macOS only) The screen capture configuration. For details, see ScreenCaptureParameters.
+   */
+  ScreenCaptureParameters params;
+  /**
+   * (For Windows and macOS only) The relative position of the shared region to the whole screen. For details, see Rectangle.
+   *
+   * If you do not set this parameter, the SDK shares the whole screen. If the region you set exceeds the boundary of the
+   * screen, only the region within the screen is shared. If you set width or height in Rectangle as 0, the whole
+   * screen is shared.
+   */
+  Rectangle regionRect;
+
+  ScreenCaptureConfiguration() : isCaptureWindow(false), displayId(0), windowId(0) {}
+};
+
+#if (defined(__APPLE__) && TARGET_OS_MAC && !TARGET_OS_IPHONE)
+/** The size of the screenshot of the screen or window.
+ */
+struct SIZE {
+  /** The width of the screenshot.
+   */
+  int width;
+  /** The height of the screenshot.
+   */
+  int height;
+
+  SIZE() : width(0), height(0) {}
+  SIZE(int ww, int hh) : width(ww), height(hh) {}
+};
+#endif
+
+#if defined(_WIN32) || (defined(__APPLE__) && TARGET_OS_MAC && !TARGET_OS_IPHONE)
+/**
+ * The image content of the thumbnail or icon.
+ * @note The default image is in the RGBA format. If you need to use another format, you need to convert the image on
+ * your own.
+ */
+struct ThumbImageBuffer {
+  /**
+   * The buffer of the thumbnail or icon.
+   */
+  const char* buffer;
+  /**
+   * The buffer length of the thumbnail or icon, in bytes.
+   */
+  unsigned int length;
+  /**
+   * The actual width (px) of the thumbnail or icon.
+ */ + unsigned int width; + /** + * The actual height (px) of the thumbnail or icon. + */ + unsigned int height; + ThumbImageBuffer() : buffer(nullptr), length(0), width(0), height(0) {} +}; +/** + * The type of the shared target. Set in ScreenCaptureSourceInfo. + */ +enum ScreenCaptureSourceType { + /** -1: Unknown type. */ + ScreenCaptureSourceType_Unknown = -1, + /** 0: The shared target is a window.*/ + ScreenCaptureSourceType_Window = 0, + /** 1: The shared target is a screen of a particular monitor.*/ + ScreenCaptureSourceType_Screen = 1, + /** 2: Reserved parameter.*/ + ScreenCaptureSourceType_Custom = 2, +}; +/** The information about the specified shareable window or screen. It is returned in IScreenCaptureSourceList. */ +struct ScreenCaptureSourceInfo { + /** + * The type of the shared target. See \ref agora::rtc::ScreenCaptureSourceType "ScreenCaptureSourceType". + */ + ScreenCaptureSourceType type; + /** + * The window ID for a window or the display ID for a screen. + */ + view_t sourceId; + /** + * The name of the window or screen. UTF-8 encoding. + */ + const char* sourceName; + /** + * The image content of the thumbnail. See ThumbImageBuffer. + */ + ThumbImageBuffer thumbImage; + /** + * The image content of the icon. See ThumbImageBuffer. + */ + ThumbImageBuffer iconImage; + /** + * The process to which the window belongs. UTF-8 encoding. + */ + const char* processPath; + /** + * The title of the window. UTF-8 encoding. + */ + const char* sourceTitle; + /** + * Determines whether the screen is the primary display: + * - true: The screen is the primary display. + * - false: The screen is not the primary display. + */ + bool primaryMonitor; + bool isOccluded; + /** + * The relative position of the shared region to the screen space (A virtual space include all the screens). See Rectangle. + */ + Rectangle position; +#if defined(_WIN32) + /** + * Determines whether the window is minimized. 
+   */
+  bool minimizeWindow;
+  /**
+   * The display ID to the window of interest.
+   * If the window intersects one or more display monitor rectangles, the return value is a valid
+   * ID to the display monitor that has the largest area of intersection with the window. Otherwise,
+   * the return value is -2.
+   */
+  view_t sourceDisplayId;
+  ScreenCaptureSourceInfo() : type(ScreenCaptureSourceType_Unknown), sourceId(nullptr), sourceName(nullptr),
+                              processPath(nullptr), sourceTitle(nullptr), primaryMonitor(false), isOccluded(false), minimizeWindow(false), sourceDisplayId((view_t)-2) {}
+#else
+  ScreenCaptureSourceInfo() : type(ScreenCaptureSourceType_Unknown), sourceId(nullptr), sourceName(nullptr), processPath(nullptr), sourceTitle(nullptr), primaryMonitor(false), isOccluded(false) {}
+#endif
+};
+/**
+ * The IScreenCaptureSourceList class. This class is returned in the getScreenCaptureSources method.
+ */
+class IScreenCaptureSourceList {
+ protected:
+  virtual ~IScreenCaptureSourceList(){};
+
+ public:
+  /**
+   * Gets the number of shareable windows and screens.
+   *
+   * @return The number of shareable windows and screens.
+   */
+  virtual unsigned int getCount() = 0;
+  /**
+   * Gets information about the specified shareable window or screen.
+   *
+   * After you get IScreenCaptureSourceList, you can pass in the index value of the specified shareable window or
+   * screen to get information about that window or screen from ScreenCaptureSourceInfo.
+   *
+   * @param index The index of the specified shareable window or screen. The value range is [0, getCount()).
+   * @return ScreenCaptureSourceInfo The information of the specified window or screen.
+   */
+  virtual ScreenCaptureSourceInfo getSourceInfo(unsigned int index) = 0;
+  /**
+   * Releases IScreenCaptureSourceList.
+   *
+   * After you get the list of shareable windows and screens, to avoid memory leaks, call this method to release
+   * IScreenCaptureSourceList instead of deleting IScreenCaptureSourceList directly.
+   */
+  virtual void release() = 0;
+};
+#endif  // _WIN32 || (__APPLE__ && !TARGET_OS_IPHONE && TARGET_OS_MAC)
+/**
+ * The advanced options for audio.
+ */
+struct AdvancedAudioOptions {
+  /**
+   * The number of audio processing channels; only 1 or 2 are supported.
+   */
+  Optional<int> audioProcessingChannels;
+
+  AdvancedAudioOptions() {}
+  ~AdvancedAudioOptions() {}
+};
+
+struct ImageTrackOptions {
+  const char* imageUrl;
+  int fps;
+  VIDEO_MIRROR_MODE_TYPE mirrorMode;
+  ImageTrackOptions() : imageUrl(NULL), fps(1), mirrorMode(VIDEO_MIRROR_MODE_DISABLED) {}
+};
+
+/**
+ * The channel media options.
+ *
+ * Agora supports publishing multiple audio streams and one video stream at the same time and in the same RtcConnection.
+ * For example, `publishAudioTrack`, `publishCustomAudioTrack` and `publishMediaPlayerAudioTrack` can be true at the same time;
+ * but only one of `publishCameraTrack`, `publishScreenTrack`, `publishCustomVideoTrack`, and `publishEncodedVideoTrack` can be
+ * true at the same time.
+ */
+struct ChannelMediaOptions {
+  /**
+   * Whether to publish the video of the camera track.
+   * - `true`: (Default) Publish the video track of the camera capturer.
+   * - `false`: Do not publish the video track of the camera capturer.
+   */
+  Optional<bool> publishCameraTrack;
+  /**
+   * Whether to publish the video of the secondary camera track.
+   * - `true`: Publish the video track of the secondary camera capturer.
+   * - `false`: (Default) Do not publish the video track of the secondary camera capturer.
+   */
+  Optional<bool> publishSecondaryCameraTrack;
+  /**
+   * Whether to publish the video of the third camera track.
+   * - `true`: Publish the video track of the third camera capturer.
+   * - `false`: (Default) Do not publish the video track of the third camera capturer.
+   */
+  Optional<bool> publishThirdCameraTrack;
+  /**
+   * Whether to publish the video of the fourth camera track.
+   * - `true`: Publish the video track of the fourth camera capturer.
+   * - `false`: (Default) Do not publish the video track of the fourth camera capturer.
+   */
+  Optional<bool> publishFourthCameraTrack;
+  /**
+   * Whether to publish the recorded audio.
+   * - `true`: (Default) Publish the recorded audio.
+   * - `false`: Do not publish the recorded audio.
+   */
+  Optional<bool> publishMicrophoneTrack;
+
+  #if defined(__ANDROID__) || (defined(TARGET_OS_IPHONE) && TARGET_OS_IPHONE)
+  /**
+   * Whether to publish the video track of the screen capturer:
+   * - `true`: Publish the video track of the screen capture.
+   * - `false`: (Default) Do not publish the video track of the screen capture.
+   */
+  Optional<bool> publishScreenCaptureVideo;
+  /**
+   * Whether to publish the audio track of the screen capturer:
+   * - `true`: Publish the audio track of the screen capturer.
+   * - `false`: (Default) Do not publish the audio track of the screen capturer.
+   */
+  Optional<bool> publishScreenCaptureAudio;
+  #else
+  /**
+   * Whether to publish the captured video from the screen:
+   * - `true`: Publish the captured video from the screen.
+   * - `false`: (Default) Do not publish the captured video from the screen.
+   */
+  Optional<bool> publishScreenTrack;
+  /**
+   * Whether to publish the captured video from the secondary screen:
+   * - true: Publish the captured video from the secondary screen.
+   * - false: (Default) Do not publish the captured video from the secondary screen.
+   */
+  Optional<bool> publishSecondaryScreenTrack;
+  /**
+   * Whether to publish the captured video from the third screen:
+   * - true: Publish the captured video from the third screen.
+   * - false: (Default) Do not publish the captured video from the third screen.
+   */
+  Optional<bool> publishThirdScreenTrack;
+  /**
+   * Whether to publish the captured video from the fourth screen:
+   * - true: Publish the captured video from the fourth screen.
+   * - false: (Default) Do not publish the captured video from the fourth screen.
+   */
+  Optional<bool> publishFourthScreenTrack;
+  #endif
+
+  /**
+   * Whether to publish the captured audio from a custom source:
+   * - true: Publish the captured audio from a custom source.
+   * - false: (Default) Do not publish the captured audio from the custom source.
+   */
+  Optional<bool> publishCustomAudioTrack;
+  /**
+   * The custom audio track id. The default value is 0.
+   */
+  Optional<int> publishCustomAudioTrackId;
+  /**
+   * Whether to publish the captured video from a custom source:
+   * - `true`: Publish the captured video from a custom source.
+   * - `false`: (Default) Do not publish the captured video from the custom source.
+   */
+  Optional<bool> publishCustomVideoTrack;
+  /**
+   * Whether to publish the encoded video:
+   * - `true`: Publish the encoded video.
+   * - `false`: (Default) Do not publish the encoded video.
+   */
+  Optional<bool> publishEncodedVideoTrack;
+  /**
+   * Whether to publish the audio from the media player:
+   * - `true`: Publish the audio from the media player.
+   * - `false`: (Default) Do not publish the audio from the media player.
+   */
+  Optional<bool> publishMediaPlayerAudioTrack;
+  /**
+   * Whether to publish the video from the media player:
+   * - `true`: Publish the video from the media player.
+   * - `false`: (Default) Do not publish the video from the media player.
+   */
+  Optional<bool> publishMediaPlayerVideoTrack;
+  /**
+   * Whether to publish the local transcoded video track.
+   * - `true`: Publish the video track of local transcoded video track.
+   * - `false`: (Default) Do not publish the local transcoded video track.
+   */
+  Optional<bool> publishTranscodedVideoTrack;
+  /**
+   * Whether to publish the local mixed track.
+   * - `true`: Publish the audio track of local mixed track.
+   * - `false`: (Default) Do not publish the local mixed track.
+   */
+  Optional<bool> publishMixedAudioTrack;
+  /**
+   * Whether to automatically subscribe to all remote audio streams when the user joins a channel:
+   * - `true`: (Default) Subscribe to all remote audio streams.
+   * - `false`: Do not subscribe to any remote audio stream.
+   */
+  Optional<bool> autoSubscribeAudio;
+  /**
+   * Whether to subscribe to all remote video streams when the user joins the channel:
+   * - `true`: (Default) Subscribe to all remote video streams.
+   * - `false`: Do not subscribe to any remote video stream.
+   */
+  Optional<bool> autoSubscribeVideo;
+  /**
+   * Whether to enable audio capturing or playback.
+   * - `true`: (Default) Enable audio capturing and playback.
+   * - `false`: Do not enable audio capturing or playback.
+   */
+  Optional<bool> enableAudioRecordingOrPlayout;
+  /**
+   * The ID of the media player to be published. The default value is 0.
+   */
+  Optional<int> publishMediaPlayerId;
+  /**
+   * The client role type. See \ref CLIENT_ROLE_TYPE.
+   * Default is CLIENT_ROLE_AUDIENCE.
+   */
+  Optional<CLIENT_ROLE_TYPE> clientRoleType;
+  /**
+   * The audience latency level type. See #AUDIENCE_LATENCY_LEVEL_TYPE.
+   */
+  Optional<AUDIENCE_LATENCY_LEVEL_TYPE> audienceLatencyLevel;
+  /**
+   * The default video stream type. See \ref VIDEO_STREAM_TYPE.
+   * Default is VIDEO_STREAM_HIGH.
+   */
+  Optional<VIDEO_STREAM_TYPE> defaultVideoStreamType;
+  /**
+   * The channel profile. See \ref CHANNEL_PROFILE_TYPE.
+   * Default is CHANNEL_PROFILE_LIVE_BROADCASTING.
+   */
+  Optional<CHANNEL_PROFILE_TYPE> channelProfile;
+  /**
+   * The delay in ms for sending audio frames. This is used for explicit control of A/V sync.
+   * To switch off the delay, set the value to zero.
+   */
+  Optional<int> audioDelayMs;
+  /**
+   * The delay in ms for sending media player audio frames. This is used for explicit control of A/V sync.
+   * To switch off the delay, set the value to zero.
+   */
+  Optional<int> mediaPlayerAudioDelayMs;
+  /**
+   * (Optional) The token generated on your server for authentication.
+   * @note
+   * - This parameter takes effect only when calling `updateChannelMediaOptions` or `updateChannelMediaOptionsEx`.
+   * - Ensure that the App ID, channel name, and user name used for creating the token are the same ones as those
+   * used by the initialize method for initializing the RTC engine, and those used by the `joinChannel [2/2]`
+   * and `joinChannelEx` methods for joining the channel.
+   */
+  Optional<const char*> token;
+  /**
+   * Whether to enable media packet encryption:
+   * - `true`: Yes.
+   * - `false`: (Default) No.
+   *
+   * @note This parameter is ignored when calling `updateChannelMediaOptions`.
+   */
+  Optional<bool> enableBuiltInMediaEncryption;
+  /**
+   * Whether to publish the sound of the rhythm player to remote users:
+   * - `true`: (Default) Publish the sound of the rhythm player.
+   * - `false`: Do not publish the sound of the rhythm player.
+   */
+  Optional<bool> publishRhythmPlayerTrack;
+  /**
+   * Whether the user is an interactive audience member in the channel.
+   * - `true`: Enable low latency and smooth video when joining as an audience member.
+   * - `false`: (Default) Use default settings for the audience role.
+   * @note This mode is only used for the audience. In PK mode, a client might join one channel as a broadcaster, and join
+   * another channel as an interactive audience member, to achieve low latency and smooth video from the remote user.
+   */
+  Optional<bool> isInteractiveAudience;
+  /**
+   * The custom video track ID to be used for publishing or previewing.
+   * You can get the video track ID after calling createCustomVideoTrack() of IRtcEngine.
+   */
+  Optional<video_track_id_t> customVideoTrackId;
+  /**
+   * Whether the local audio stream can be filtered.
+   * - `true`: (Default) The audio stream can be filtered when its audio level is low.
+   * - `false`: Do not filter this audio stream.
+   */
+  Optional<bool> isAudioFilterable;
+
+  ChannelMediaOptions() {}
+  ~ChannelMediaOptions() {}
+
+  void SetAll(const ChannelMediaOptions& change) {
+#define SET_FROM(X) SetFrom(&X, change.X)
+
+    SET_FROM(publishCameraTrack);
+    SET_FROM(publishSecondaryCameraTrack);
+    SET_FROM(publishThirdCameraTrack);
+    SET_FROM(publishFourthCameraTrack);
+    SET_FROM(publishMicrophoneTrack);
+#if defined(__ANDROID__) || (defined(TARGET_OS_IPHONE) && TARGET_OS_IPHONE)
+    SET_FROM(publishScreenCaptureVideo);
+    SET_FROM(publishScreenCaptureAudio);
+#else
+    SET_FROM(publishScreenTrack);
+    SET_FROM(publishSecondaryScreenTrack);
+    SET_FROM(publishThirdScreenTrack);
+    SET_FROM(publishFourthScreenTrack);
+#endif
+    SET_FROM(publishTranscodedVideoTrack);
+    SET_FROM(publishMixedAudioTrack);
+    SET_FROM(publishCustomAudioTrack);
+    SET_FROM(publishCustomAudioTrackId);
+    SET_FROM(publishCustomVideoTrack);
+    SET_FROM(publishEncodedVideoTrack);
+    SET_FROM(publishMediaPlayerAudioTrack);
+    SET_FROM(publishMediaPlayerVideoTrack);
+    SET_FROM(autoSubscribeAudio);
+    SET_FROM(autoSubscribeVideo);
+    SET_FROM(publishMediaPlayerId);
+    SET_FROM(enableAudioRecordingOrPlayout);
+    SET_FROM(clientRoleType);
+    SET_FROM(audienceLatencyLevel);
+    SET_FROM(defaultVideoStreamType);
+    SET_FROM(channelProfile);
+    SET_FROM(audioDelayMs);
+    SET_FROM(mediaPlayerAudioDelayMs);
+    SET_FROM(token);
+    SET_FROM(enableBuiltInMediaEncryption);
+    SET_FROM(publishRhythmPlayerTrack);
+    SET_FROM(customVideoTrackId);
+    SET_FROM(isAudioFilterable);
+    SET_FROM(isInteractiveAudience);
+#undef SET_FROM
+  }
+
+  bool operator==(const ChannelMediaOptions& o) const {
+#define BEGIN_COMPARE() bool b = true
+#define ADD_COMPARE(X) b = (b && (X == o.X))
+#define END_COMPARE()
+
+    BEGIN_COMPARE();
+    ADD_COMPARE(publishCameraTrack);
+    ADD_COMPARE(publishSecondaryCameraTrack);
+    ADD_COMPARE(publishThirdCameraTrack);
+    ADD_COMPARE(publishFourthCameraTrack);
+    ADD_COMPARE(publishMicrophoneTrack);
+#if defined(__ANDROID__) || (defined(TARGET_OS_IPHONE) &&
TARGET_OS_IPHONE) + ADD_COMPARE(publishScreenCaptureVideo); + ADD_COMPARE(publishScreenCaptureAudio); +#else + ADD_COMPARE(publishScreenTrack); + ADD_COMPARE(publishSecondaryScreenTrack); + ADD_COMPARE(publishThirdScreenTrack); + ADD_COMPARE(publishFourthScreenTrack); +#endif + ADD_COMPARE(publishTranscodedVideoTrack); + ADD_COMPARE(publishMixedAudioTrack); + ADD_COMPARE(publishCustomAudioTrack); + ADD_COMPARE(publishCustomAudioTrackId); + ADD_COMPARE(publishCustomVideoTrack); + ADD_COMPARE(publishEncodedVideoTrack); + ADD_COMPARE(publishMediaPlayerAudioTrack); + ADD_COMPARE(publishMediaPlayerVideoTrack); + ADD_COMPARE(autoSubscribeAudio); + ADD_COMPARE(autoSubscribeVideo); + ADD_COMPARE(publishMediaPlayerId); + ADD_COMPARE(enableAudioRecordingOrPlayout); + ADD_COMPARE(clientRoleType); + ADD_COMPARE(audienceLatencyLevel); + ADD_COMPARE(defaultVideoStreamType); + ADD_COMPARE(channelProfile); + ADD_COMPARE(audioDelayMs); + ADD_COMPARE(mediaPlayerAudioDelayMs); + ADD_COMPARE(token); + ADD_COMPARE(enableBuiltInMediaEncryption); + ADD_COMPARE(publishRhythmPlayerTrack); + ADD_COMPARE(customVideoTrackId); + ADD_COMPARE(isAudioFilterable); + ADD_COMPARE(isInteractiveAudience); + END_COMPARE(); + +#undef BEGIN_COMPARE +#undef ADD_COMPARE +#undef END_COMPARE + return b; + } + + ChannelMediaOptions& operator=(const ChannelMediaOptions& replace) { + if (this != &replace) { +#define REPLACE_BY(X) ReplaceBy(&X, replace.X) + + REPLACE_BY(publishCameraTrack); + REPLACE_BY(publishSecondaryCameraTrack); + REPLACE_BY(publishThirdCameraTrack); + REPLACE_BY(publishFourthCameraTrack); + REPLACE_BY(publishMicrophoneTrack); +#if defined(__ANDROID__) || (defined(TARGET_OS_IPHONE) && TARGET_OS_IPHONE) + REPLACE_BY(publishScreenCaptureVideo); + REPLACE_BY(publishScreenCaptureAudio); +#else + REPLACE_BY(publishScreenTrack); + REPLACE_BY(publishSecondaryScreenTrack); + REPLACE_BY(publishThirdScreenTrack); + REPLACE_BY(publishFourthScreenTrack); +#endif + 
REPLACE_BY(publishTranscodedVideoTrack); + REPLACE_BY(publishMixedAudioTrack); + REPLACE_BY(publishCustomAudioTrack); + REPLACE_BY(publishCustomAudioTrackId); + REPLACE_BY(publishCustomVideoTrack); + REPLACE_BY(publishEncodedVideoTrack); + REPLACE_BY(publishMediaPlayerAudioTrack); + REPLACE_BY(publishMediaPlayerVideoTrack); + REPLACE_BY(autoSubscribeAudio); + REPLACE_BY(autoSubscribeVideo); + REPLACE_BY(publishMediaPlayerId); + REPLACE_BY(enableAudioRecordingOrPlayout); + REPLACE_BY(clientRoleType); + REPLACE_BY(audienceLatencyLevel); + REPLACE_BY(defaultVideoStreamType); + REPLACE_BY(channelProfile); + REPLACE_BY(audioDelayMs); + REPLACE_BY(mediaPlayerAudioDelayMs); + REPLACE_BY(token); + REPLACE_BY(enableBuiltInMediaEncryption); + REPLACE_BY(publishRhythmPlayerTrack); + REPLACE_BY(customVideoTrackId); + REPLACE_BY(isAudioFilterable); + REPLACE_BY(isInteractiveAudience); +#undef REPLACE_BY + } + return *this; + } +}; + +enum PROXY_TYPE { + /** 0: Do not use the cloud proxy. + */ + NONE_PROXY_TYPE = 0, + /** 1: The cloud proxy for the UDP protocol. + */ + UDP_PROXY_TYPE = 1, + /** 2: The cloud proxy for the TCP (encrypted) protocol. + */ + TCP_PROXY_TYPE = 2, + /** 3: The local proxy. + */ + LOCAL_PROXY_TYPE = 3, + /** 4: Automatically fall back to the TCP cloud proxy. + */ + TCP_PROXY_AUTO_FALLBACK_TYPE = 4, + /** 5: The HTTP proxy. + */ + HTTP_PROXY_TYPE = 5, + /** 6: The HTTPS proxy. + */ + HTTPS_PROXY_TYPE = 6, +}; + +enum FeatureType { + VIDEO_VIRTUAL_BACKGROUND = 1, + VIDEO_BEAUTY_EFFECT = 2, +}; + +/** + * The options for leaving a channel. + */ +struct LeaveChannelOptions { + /** + * Whether to stop playing and mixing the music file when a user leaves the channel. + * - `true`: (Default) Stop playing and mixing the music file. + * - `false`: Do not stop playing and mixing the music file. + */ + bool stopAudioMixing; + /** + * Whether to stop playing all audio effects when a user leaves the channel. + * - `true`: (Default) Stop playing all audio effects.
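The SET_FROM / REPLACE_BY macros above work because every field is an Optional that records whether it was explicitly set, so a merge only overwrites fields the caller touched. A minimal sketch of the pattern, using a hypothetical `Optional` and a two-field `DemoOptions` struct (simplified stand-ins, not the SDK's types):

```cpp
#include <cassert>

// Hypothetical minimal Optional (NOT the SDK's): tracks whether a field was set.
template <typename T>
struct Optional {
  T value{};
  bool has_value = false;
  Optional& operator=(T v) { value = v; has_value = true; return *this; }
};

// Merge helper: copy src into dst only when the caller explicitly set src.
template <typename T>
void SetFrom(Optional<T>* dst, const Optional<T>& src) {
  if (src.has_value) *dst = src.value;
}

// Two-field stand-in for ChannelMediaOptions, using the same macro shape.
struct DemoOptions {
  Optional<bool> publishCameraTrack;
  Optional<int> audioDelayMs;

  void SetAll(const DemoOptions& change) {
#define SET_FROM(X) SetFrom(&X, change.X)
    SET_FROM(publishCameraTrack);
    SET_FROM(audioDelayMs);
#undef SET_FROM
  }
};
```

Because unset fields are skipped, `SetAll` can be used to apply a partial update (e.g. from `updateChannelMediaOptions`-style calls) without clobbering options set earlier.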
+ * - `false`: Do not stop playing any audio effect. + */ + bool stopAllEffect; + /** + * Whether to stop microphone recording when a user leaves the channel. + * - `true`: (Default) Stop microphone recording. + * - `false`: Do not stop microphone recording. + */ + bool stopMicrophoneRecording; + + LeaveChannelOptions() : stopAudioMixing(true), stopAllEffect(true), stopMicrophoneRecording(true) {} +}; + +/** + * The IRtcEngineEventHandler class. + * + * The SDK uses this class to send callback event notifications to the app, and the app inherits + * the methods in this class to retrieve these event notifications. + * + * All methods in this class have their default (empty) implementations, and the app can inherit + * only some of the required events instead of all. In the callback methods, the app should avoid + * time-consuming tasks or calling blocking APIs, otherwise the SDK may not work properly. + */ +class IRtcEngineEventHandler { + public: + virtual ~IRtcEngineEventHandler() {} + + virtual const char* eventHandlerType() const { return "event_handler"; } + + /** + * Occurs when a user joins a channel. + * + * This callback notifies the application that a user joins a specified channel. + * + * @param channel The channel name. + * @param uid The ID of the user who joins the channel. + * @param elapsed The time elapsed (ms) from the local user calling joinChannel until the SDK triggers this callback. + */ + virtual void onJoinChannelSuccess(const char* channel, uid_t uid, int elapsed) { + (void)channel; + (void)uid; + (void)elapsed; + } + + /** + * Occurs when a user rejoins the channel. + * + * When a user loses connection with the server because of network problems, the SDK automatically tries to reconnect + * and triggers this callback upon reconnection. + * + * @param channel The channel name. + * @param uid The ID of the user who rejoins the channel. 
+ * @param elapsed Time elapsed (ms) from the local user calling the joinChannel method until this callback is triggered. + */ + virtual void onRejoinChannelSuccess(const char* channel, uid_t uid, int elapsed) { + (void)channel; + (void)uid; + (void)elapsed; + } + + /** Occurs when the user successfully joins the channel after calling \ref IRtcEngine::setLocalAccessPoint "setLocalAccessPoint" or \ref IRtcEngine::setCloudProxy "setCloudProxy". + @param channel Channel name. + @param uid User ID of the user joining the channel. + @param proxyType The type of proxy the Agora SDK is connected to. `proxyType` is NONE_PROXY_TYPE if the SDK is not connected to a proxy (fallback). + @param localProxyIp The local proxy IP. If the user does not join the channel via a local proxy, this value is an empty string. + @param elapsed Time elapsed (ms) from the user calling the \ref IRtcEngine::joinChannel "joinChannel" method until the SDK triggers this callback. + */ + virtual void onProxyConnected(const char* channel, uid_t uid, PROXY_TYPE proxyType, const char* localProxyIp, int elapsed) { + (void)channel; + (void)uid; + (void)proxyType; + (void)localProxyIp; + (void)elapsed; + } + + /** An error occurs during the SDK runtime. + + @param err The error code: #ERROR_CODE_TYPE. + @param msg The detailed error message. + */ + virtual void onError(int err, const char* msg) { + (void)err; + (void)msg; + } + + /** Reports the statistics of the audio stream from each remote + user/broadcaster. + + @deprecated This callback is deprecated. Use onRemoteAudioStats instead. + + The SDK triggers this callback once every two seconds to report the audio + quality of each remote user/host sending an audio stream. If a channel has + multiple remote users/hosts sending audio streams, the SDK triggers this + callback as many times. + + @param uid The user ID of the remote user sending the audio stream.
+ @param quality The audio quality of the user: #QUALITY_TYPE + @param delay The network delay (ms) from the sender to the receiver, including the delay caused by audio sampling pre-processing, network transmission, and network jitter buffering. + @param lost The audio packet loss rate (%) from the sender to the receiver. + */ + virtual void onAudioQuality(uid_t uid, int quality, unsigned short delay, unsigned short lost) __deprecated { + (void)uid; + (void)quality; + (void)delay; + (void)lost; + } + + /** Reports the result of the last-mile network probe test. + * + * The SDK triggers this callback within 30 seconds after the app calls the `startLastmileProbeTest` method. + * @param result The uplink and downlink last-mile network probe test result: LastmileProbeResult. + */ + virtual void onLastmileProbeResult(const LastmileProbeResult& result) { + (void)result; + } + + /** + * Reports the volume information of users. + * + * By default, this callback is disabled. You can enable it by calling `enableAudioVolumeIndication`. Once this + * callback is enabled and users send streams in the channel, the SDK triggers the `onAudioVolumeIndication` + * callback at the time interval set in `enableAudioVolumeIndication`. The SDK triggers two independent + * `onAudioVolumeIndication` callbacks simultaneously, which separately report the volume information of the + * local user who sends a stream and the remote users (up to three) whose instantaneous volume is the highest. + * + * @note After you enable this callback, calling muteLocalAudioStream affects the SDK's behavior as follows: + * - If the local user stops publishing the audio stream, the SDK stops triggering the local user's callback.
+ * - 20 seconds after a remote user whose volume is one of the three highest stops publishing the audio stream, + * the callback excludes this user's information; 20 seconds after all remote users stop publishing audio streams, + * the SDK stops triggering the callback for remote users. + * + * @param speakers The volume information of the users, see AudioVolumeInfo. An empty `speakers` array in the + * callback indicates that no remote user is in the channel or sending a stream at the moment. + * @param speakerNumber The total number of speakers. + * - In the local user's callback, when the local user sends a stream, `speakerNumber` is 1. + * - In the callback for remote users, the value range of speakerNumber is [0,3]. If the number of remote users who + * send streams is greater than or equal to three, the value of `speakerNumber` is 3. + * @param totalVolume The volume of the speaker. The value ranges between 0 (lowest volume) and 255 (highest volume). + * - In the local user's callback, `totalVolume` is the volume of the local user who sends a stream. + * - In the remote users' callback, `totalVolume` is the sum of all remote users (up to three) whose instantaneous + * volume is the highest. If the user calls `startAudioMixing`, `totalVolume` is the volume after audio mixing. + */ + virtual void onAudioVolumeIndication(const AudioVolumeInfo* speakers, unsigned int speakerNumber, + int totalVolume) { + (void)speakers; + (void)speakerNumber; + (void)totalVolume; + } + + /** + * Occurs when a user leaves a channel. + * + * This callback notifies the app that the user leaves the channel by calling `leaveChannel`. From this callback, + * the app can get information such as the call duration and quality statistics. + * + * @param stats The statistics on the call: RtcStats. + */ + virtual void onLeaveChannel(const RtcStats& stats) { (void)stats; } + + /** + * Reports the statistics of the current call. 
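As the class comment notes, apps subclass the event handler and override only the callbacks they need, keeping the bodies light (no blocking work). A minimal sketch of that usage with stand-in declarations (`DemoEventHandler`, `DemoRtcStats`, and `DemoUid` are hypothetical simplifications, not the real header's `IRtcEngineEventHandler`):

```cpp
#include <cassert>

// Simplified stand-ins for the SDK declarations (uid, stats, base class).
typedef unsigned int DemoUid;
struct DemoRtcStats { int duration = 0; };

class DemoEventHandler {
 public:
  virtual ~DemoEventHandler() {}
  // Default (empty) implementations, as in IRtcEngineEventHandler.
  virtual void onJoinChannelSuccess(const char* channel, DemoUid uid, int elapsed) {}
  virtual void onLeaveChannel(const DemoRtcStats& stats) {}
};

// App-side handler: override only the callbacks you need and keep them light
// (record state for the app to consume; do no blocking work in the callback).
class AppHandler : public DemoEventHandler {
 public:
  bool joined = false;
  int lastCallDuration = -1;
  void onJoinChannelSuccess(const char* /*channel*/, DemoUid /*uid*/, int /*elapsed*/) override {
    joined = true;
  }
  void onLeaveChannel(const DemoRtcStats& stats) override {
    joined = false;
    lastCallDuration = stats.duration;
  }
};
```

Callbacks not overridden (e.g. `onRtcStats` in the real handler) simply fall through to the empty defaults.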
+ * + * The SDK triggers this callback once every two seconds after the user joins the channel. + * + * @param stats The statistics of the current call: RtcStats. + */ + virtual void onRtcStats(const RtcStats& stats) { (void)stats; } + + /** Occurs when the audio device state changes. + + This callback notifies the application that the system's audio device state + is changed. For example, a headset is unplugged from the device. + + @param deviceId The device ID. + @param deviceType The device type: #MEDIA_DEVICE_TYPE. + @param deviceState The device state: + - On macOS: + - 0: The device is ready for use. + - 8: The device is not connected. + - On Windows: #MEDIA_DEVICE_STATE_TYPE. + */ + virtual void onAudioDeviceStateChanged(const char* deviceId, int deviceType, int deviceState) { + (void)deviceId; + (void)deviceType; + (void)deviceState; + } + + /** + * @brief Reports the current audio mixing progress. + * + * The callback occurs once every second during playback and reports the current playback progress. + * @param position Current audio mixing progress (millisecond). + */ + virtual void onAudioMixingPositionChanged(int64_t position) {} + + /** Occurs when the audio mixing file playback finishes. + @deprecated This method is deprecated, use onAudioMixingStateChanged instead. + + After you call startAudioMixing to play a local music file, this callback occurs when the playback finishes. + If the startAudioMixing method call fails, the SDK returns the error code 701. + */ + virtual void onAudioMixingFinished() __deprecated {} + + /** + * Occurs when the playback of the local audio effect file finishes. + * + * This callback occurs when the local audio effect file finishes playing. + * + * @param soundId The audio effect ID. The ID of each audio effect file is unique. + */ + virtual void onAudioEffectFinished(int soundId) {} + + /** Occurs when the video device state changes.
+ + This callback notifies the application that the system's video device state + is changed. + + @param deviceId Pointer to the device ID. + @param deviceType Device type: #MEDIA_DEVICE_TYPE. + @param deviceState Device state: #MEDIA_DEVICE_STATE_TYPE. + */ + virtual void onVideoDeviceStateChanged(const char* deviceId, int deviceType, int deviceState) { + (void)deviceId; + (void)deviceType; + (void)deviceState; + } + + /** + * Reports the last mile network quality of each user in the channel. + * + * This callback reports the last mile network conditions of each user in the channel. Last mile refers to the + * connection between the local device and Agora's edge server. + * + * The SDK triggers this callback once every two seconds. If a channel includes multiple users, the SDK triggers + * this callback as many times. + * + * @note `txQuality` is UNKNOWN when the user is not sending a stream; `rxQuality` is UNKNOWN when the user is not + * receiving a stream. + * + * @param uid The user ID. The network quality of the user with this user ID is reported. + * @param txQuality Uplink network quality rating of the user in terms of the transmission bit rate, packet loss rate, + * average RTT (Round-Trip Time) and jitter of the uplink network. This parameter is a quality rating helping you + * understand how well the current uplink network conditions can support the selected video encoder configuration. + * For example, a 1000 Kbps uplink network may be adequate for video frames with a resolution of 640 × 480 and a frame + * rate of 15 fps in the LIVE_BROADCASTING profile, but may be inadequate for resolutions higher than 1280 × 720. + * See #QUALITY_TYPE. + * @param rxQuality Downlink network quality rating of the user in terms of packet loss rate, average RTT, and jitter + * of the downlink network. See #QUALITY_TYPE.
+ */ + virtual void onNetworkQuality(uid_t uid, int txQuality, int rxQuality) { + (void)uid; + (void)txQuality; + (void)rxQuality; + } + + /** + * Occurs when an intra request from a remote user is received. + * + * This callback is triggered when a remote user needs a keyframe. + * + */ + virtual void onIntraRequestReceived() {} + + /** + * Occurs when uplink network info is updated. + * + * The SDK triggers this callback when the uplink network information changes. + * + * @note This callback only applies to scenarios where you push externally encoded + * video data in H.264 format to the SDK. + * + * @param info The uplink network information. See UplinkNetworkInfo. + */ + virtual void onUplinkNetworkInfoUpdated(const UplinkNetworkInfo& info) { + (void)info; + } + + /** + * Occurs when downlink network info is updated. + * + * This callback notifies the app to switch between the major and minor stream if needed. + * + * @param info The downlink network info collections. + */ + virtual void onDownlinkNetworkInfoUpdated(const DownlinkNetworkInfo& info) { + (void)info; + } + + /** + * Reports the last-mile network quality of the local user. + * + * This callback reports the last-mile network conditions of the local user before the user joins + * the channel. Last mile refers to the connection between the local device and Agora's edge server. + * + * When the user is not in a channel and the last-mile network test is enabled + * (by calling `startLastmileProbeTest`), this callback function is triggered + * to update the app on the network connection quality of the local user. + * + * @param quality The last mile network quality. See #QUALITY_TYPE. + */ + virtual void onLastmileQuality(int quality) { (void)quality; } + + /** Occurs when the first local video frame is rendered on the local video view. + * + * @param source The video source: #VIDEO_SOURCE_TYPE. + * @param width The width (px) of the first local video frame.
+ * @param height The height (px) of the first local video frame. + * @param elapsed Time elapsed (ms) from the local user calling the `joinChannel` + * method until the SDK triggers this callback. If you call the `startPreview` method before calling + * the `joinChannel` method, then `elapsed` is the time elapsed from calling the + * `startPreview` method until the SDK triggers this callback. + */ + virtual void onFirstLocalVideoFrame(VIDEO_SOURCE_TYPE source, int width, int height, int elapsed) { + (void)source; + (void)width; + (void)height; + (void)elapsed; + } + + /** Occurs when the first local video frame is published. + * The SDK triggers this callback under one of the following circumstances: + * - The local client enables the video module and calls `joinChannel` successfully. + * - The local client calls `muteLocalVideoStream(true)` and `muteLocalVideoStream(false)` in sequence. + * - The local client calls `disableVideo` and `enableVideo` in sequence. + * - The local client calls `pushVideoFrame` to successfully push the video frame to the SDK. + * @param source The video source type. + * @param elapsed The time elapsed (ms) from the local user calling `joinChannel` until the SDK triggers + * this callback. + */ + virtual void onFirstLocalVideoFramePublished(VIDEO_SOURCE_TYPE source, int elapsed) { + (void)source; + (void)elapsed; + } + + /** Occurs when the first remote video frame is received and decoded. + + The SDK triggers this callback under one of the following circumstances: + - The remote user joins the channel and sends the video stream. + - The remote user stops sending the video stream and re-sends it after 15 seconds. Reasons for such an interruption include: + - The remote user leaves the channel. + - The remote user drops offline. + - The remote user calls `muteLocalVideoStream` to stop sending the video stream. + - The remote user calls `disableVideo` to disable video. + + @param uid The user ID of the remote user sending the video stream.
+ @param width The width (pixels) of the video stream. + @param height The height (pixels) of the video stream. + @param elapsed The time elapsed (ms) from the local user calling `joinChannel` + until the SDK triggers this callback. + */ + virtual void onFirstRemoteVideoDecoded(uid_t uid, int width, int height, int elapsed) __deprecated { + (void)uid; + (void)width; + (void)height; + (void)elapsed; + } + + /** + * Occurs when the local or remote video size or rotation has changed. + * @param sourceType The video source type: #VIDEO_SOURCE_TYPE. + * @param uid The user ID. 0 indicates the local user. + * @param width The new width (pixels) of the video. + * @param height The new height (pixels) of the video. + * @param rotation The rotation information of the video. + */ + virtual void onVideoSizeChanged(VIDEO_SOURCE_TYPE sourceType, uid_t uid, int width, int height, int rotation) { + (void)uid; + (void)width; + (void)height; + (void)rotation; + } + + /** Occurs when the local video stream state changes. + * + * When the state of the local video stream changes (including the state of the video capture and + * encoding), the SDK triggers this callback to report the current state. This callback indicates + * the state of the local video stream, including camera capturing and video encoding, and allows + * you to troubleshoot issues when exceptions occur. + * + * The SDK triggers the onLocalVideoStateChanged callback with the state code of `LOCAL_VIDEO_STREAM_STATE_FAILED` + * and error code of `LOCAL_VIDEO_STREAM_REASON_CAPTURE_FAILURE` in the following situations: + * - The app switches to the background, and the system gets the camera resource. + * - The camera starts normally, but does not output video for four consecutive seconds. 
+ * + * When the camera outputs the captured video frames, if the video frames are the same for 15 + * consecutive frames, the SDK triggers the `onLocalVideoStateChanged` callback with the state code + * of `LOCAL_VIDEO_STREAM_STATE_CAPTURING` and error code of `LOCAL_VIDEO_STREAM_REASON_CAPTURE_FAILURE`. + * Note that the video frame duplication detection is only available for video frames with a resolution + * greater than 200 × 200, a frame rate greater than or equal to 10 fps, and a bitrate less than 20 Kbps. + * + * @note For some device models, the SDK does not trigger this callback when the state of the local + * video changes while the local video capturing device is in use, so you have to make your own + * timeout judgment. + * + * @param source The video source type: #VIDEO_SOURCE_TYPE. + * @param state The state of the local video. See #LOCAL_VIDEO_STREAM_STATE. + * @param reason The detailed error information. See #LOCAL_VIDEO_STREAM_REASON. + */ + virtual void onLocalVideoStateChanged(VIDEO_SOURCE_TYPE source, LOCAL_VIDEO_STREAM_STATE state, LOCAL_VIDEO_STREAM_REASON reason) { + (void)source; + (void)state; + (void)reason; + } + + /** + * Occurs when the remote video state changes. + * + * @note This callback does not work properly when the number of users (in the voice/video call + * channel) or hosts (in the live streaming channel) in the channel exceeds 17. + * + * @param uid The ID of the user whose video state has changed. + * @param state The remote video state: #REMOTE_VIDEO_STATE. + * @param reason The reason of the remote video state change: #REMOTE_VIDEO_STATE_REASON. + * @param elapsed The time elapsed (ms) from the local client calling `joinChannel` until this callback is triggered.
+ */ + virtual void onRemoteVideoStateChanged(uid_t uid, REMOTE_VIDEO_STATE state, REMOTE_VIDEO_STATE_REASON reason, int elapsed) { + (void)uid; + (void)state; + (void)reason; + (void)elapsed; + } + + /** Occurs when the renderer receives the first frame of the remote video. + * + * @param uid The user ID of the remote user sending the video stream. + * @param width The width (px) of the video frame. + * @param height The height (px) of the video frame. + * @param elapsed The time elapsed (ms) from the local user calling `joinChannel` until the SDK triggers this callback. + */ + virtual void onFirstRemoteVideoFrame(uid_t uid, int width, int height, int elapsed) { + (void)uid; + (void)width; + (void)height; + (void)elapsed; + } + + /** + * Occurs when a remote user or broadcaster joins the channel. + * + * - In the COMMUNICATION channel profile, this callback indicates that a remote user joins the channel. + * The SDK also triggers this callback to report the existing users in the channel when a user joins the + * channel. + * - In the LIVE_BROADCASTING channel profile, this callback indicates that a host joins the channel. The + * SDK also triggers this callback to report the existing hosts in the channel when a host joins the + * channel. Agora recommends limiting the number of hosts to 17. + * + * The SDK triggers this callback under one of the following circumstances: + * - A remote user/host joins the channel by calling the `joinChannel` method. + * - A remote user switches the user role to the host after joining the channel. + * - A remote user/host rejoins the channel after a network interruption. + * + * @param uid The ID of the remote user or broadcaster joining the channel. + * @param elapsed The time elapsed (ms) from the local user calling `joinChannel` or `setClientRole` + * until this callback is triggered.
+ */ + virtual void onUserJoined(uid_t uid, int elapsed) { + (void)uid; + (void)elapsed; + } + + /** + * Occurs when a remote user or broadcaster goes offline. + * + * There are several reasons for a user to go offline: + * - Leave the channel: When the user leaves the channel, the user sends a goodbye message. When this + * message is received, the SDK determines that the user leaves the channel. + * - Drop offline: When no data packet of the user is received for a certain period of time, the SDK assumes + * that the user drops offline. A poor network connection may lead to false detection, so we recommend using + * the RTM SDK for reliable offline detection. + * - The user switches the user role from a broadcaster to an audience. + * + * @param uid The ID of the remote user or broadcaster who leaves the channel or drops offline. + * @param reason The reason why the remote user goes offline: #USER_OFFLINE_REASON_TYPE. + */ + virtual void onUserOffline(uid_t uid, USER_OFFLINE_REASON_TYPE reason) { + (void)uid; + (void)reason; + } + + /** Occurs when a remote user's audio stream playback pauses/resumes. + + The SDK triggers this callback when the remote user stops or resumes sending the audio stream by + calling the `muteLocalAudioStream` method. + + @note This callback can be inaccurate when the number of users (in the `COMMUNICATION` profile) or hosts (in the `LIVE_BROADCASTING` profile) in the channel exceeds 17. + + @param uid The user ID. + @param muted Whether the remote user's audio stream is muted/unmuted: + - true: Muted. + - false: Unmuted. + */ + virtual void onUserMuteAudio(uid_t uid, bool muted) { + (void)uid; + (void)muted; + } + + /** Occurs when a remote user pauses or resumes sending the video stream. + * + * When a remote user calls `muteLocalVideoStream` to stop or resume publishing the video stream, the + * SDK triggers this callback to report the state of the remote user's publishing stream to the local + * user.
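A common use of the onUserJoined / onUserOffline pair is to maintain a local roster of remote uids. A minimal sketch with stand-in types (`DemoUid`, `DemoOfflineReason`, and `RemoteRoster` are hypothetical, not SDK types):

```cpp
#include <cassert>
#include <set>

typedef unsigned int DemoUid;  // stand-in for uid_t
// Stand-in for USER_OFFLINE_REASON_TYPE.
enum DemoOfflineReason { QUIT = 0, DROPPED = 1, BECOME_AUDIENCE = 2 };

// Roster kept in sync from the join/offline callbacks.
struct RemoteRoster {
  std::set<DemoUid> uids;
  void onUserJoined(DemoUid uid, int /*elapsed*/) { uids.insert(uid); }
  void onUserOffline(DemoUid uid, DemoOfflineReason /*reason*/) { uids.erase(uid); }
};
```

Since onUserJoined also fires once for every user already in the channel when the local user joins, the roster converges to the full set of remote users without any extra query.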
+ + @note This callback is invalid when the number of users or broadcasters in a + channel exceeds 20. + + @param uid The ID of the remote user. + @param muted Whether the remote user stops publishing the video stream: + - true: The remote user has paused sending the video stream. + - false: The remote user has resumed sending the video stream. + */ + virtual void onUserMuteVideo(uid_t uid, bool muted) { + (void)uid; + (void)muted; + } + + /** Occurs when a remote user enables or disables the video module. + + Once the video function is disabled, the users cannot see any video. + + The SDK triggers this callback when a remote user enables or disables the video module by calling the + `enableVideo` or `disableVideo` method. + + @param uid The ID of the remote user. + @param enabled Whether the video of the remote user is enabled: + - true: The remote user has enabled video. + - false: The remote user has disabled video. + */ + virtual void onUserEnableVideo(uid_t uid, bool enabled) { + (void)uid; + (void)enabled; + } + + /** + * Occurs when the remote user audio or video state is updated. + * @param uid The uid of the remote user. + * @param state The remote user's audio or video state: #REMOTE_USER_STATE. + */ + virtual void onUserStateChanged(uid_t uid, REMOTE_USER_STATE state) { + (void)uid; + (void)state; + } + + /** Occurs when a remote user enables or disables local video capturing. + + The SDK triggers this callback when the remote user resumes or stops capturing the video stream by + calling the `enableLocalVideo` method. + + @param uid The ID of the remote user. + @param enabled Whether the specified remote user enables/disables local video: + - `true`: The remote user has enabled local video capturing. + - `false`: The remote user has disabled local video capturing.
+ */ + virtual void onUserEnableLocalVideo(uid_t uid, bool enabled) __deprecated { + (void)uid; + (void)enabled; + } + + /** Reports the statistics of the audio stream from each remote user/host. + + The SDK triggers this callback once every two seconds for each remote user who is sending audio + streams. If a channel includes multiple remote users, the SDK triggers this callback as many times. + + @param stats Statistics of the received remote audio streams. See RemoteAudioStats. + */ + virtual void onRemoteAudioStats(const RemoteAudioStats& stats) { + (void)stats; + } + + /** Reports the statistics of the local audio stream. + * + * The SDK triggers this callback once every two seconds. + * + * @param stats The statistics of the local audio stream. + * See LocalAudioStats. + */ + virtual void onLocalAudioStats(const LocalAudioStats& stats) { + (void)stats; + } + + /** Reports the statistics of the local video stream. + * + * The SDK triggers this callback once every two seconds for each + * user/host. If there are multiple users/hosts in the channel, the SDK + * triggers this callback as many times. + * + * @note If you have called the `enableDualStreamMode` + * method, this callback reports the statistics of the high-video + * stream (high bitrate, and high-resolution video stream). + * + * @param source The video source type. See #VIDEO_SOURCE_TYPE. + * @param stats Statistics of the local video stream. See LocalVideoStats. + */ + virtual void onLocalVideoStats(VIDEO_SOURCE_TYPE source, const LocalVideoStats& stats) { + (void)source; + (void)stats; + } + + /** Reports the statistics of the video stream from each remote user/host. + * + * The SDK triggers this callback once every two seconds for each remote user. If a channel has + * multiple users/hosts sending video streams, the SDK triggers this callback as many times. + * + * @param stats Statistics of the remote video stream. See + * RemoteVideoStats. 
+ */ + virtual void onRemoteVideoStats(const RemoteVideoStats& stats) { + (void)stats; + } + + /** + * Occurs when the camera turns on and is ready to capture the video. + * @deprecated Use `LOCAL_VIDEO_STREAM_STATE_CAPTURING(1)` in onLocalVideoStateChanged instead. + * This callback indicates that the camera has been successfully turned on and you can start to capture video. + */ + virtual void onCameraReady() __deprecated {} + + /** + * Occurs when the camera focus area changes. + * + * @note This callback is for Android and iOS only. + * + * @param x The x coordinate of the changed camera focus area. + * @param y The y coordinate of the changed camera focus area. + * @param width The width of the changed camera focus area. + * @param height The height of the changed camera focus area. + */ + virtual void onCameraFocusAreaChanged(int x, int y, int width, int height) { + (void)x; + (void)y; + (void)width; + (void)height; + } + /** + * Occurs when the camera exposure area changes. + * + * @param x The x coordinate of the changed camera exposure area. + * @param y The y coordinate of the changed camera exposure area. + * @param width The width of the changed camera exposure area. + * @param height The height of the changed exposure area. + */ + virtual void onCameraExposureAreaChanged(int x, int y, int width, int height) { + (void)x; + (void)y; + (void)width; + (void)height; + } +#if defined(__ANDROID__) || (defined(__APPLE__) && TARGET_OS_IOS) + /** + * Reports the face detection result of the local user. + * + * Once you enable face detection by calling enableFaceDetection(true), you can get the following + * information on the local user in real-time: + * - The width and height of the local video. + * - The position of the human face in the local view. + * - The distance between the human face and the screen. + * + * This value is based on the fitting calculation of the local video size and the position of the human face.
+ * + * @note + * - This callback is for Android and iOS only. + * - When it is detected that the face in front of the camera disappears, the callback will be + * triggered immediately. In the state of no face, the trigger frequency of the callback will be + * reduced to save power consumption on the local device. + * - The SDK stops triggering this callback when a human face is in close proximity to the screen. + * On Android, the value of `distance` reported in this callback may be slightly different from the + * actual distance. Therefore, Agora does not recommend using it for accurate calculation. + * + * @param imageWidth The width (px) of the video image captured by the local camera. + * @param imageHeight The height (px) of the video image captured by the local camera. + * @param vecRectangle A Rectangle array of length 'numFaces', which represents the position and size of the human face on the local video: + * - x: The x-coordinate (px) of the human face in the local view. Taking the top left corner of the view as the origin, the x-coordinate represents the horizontal position of the human face relative to the origin. + * - y: The y-coordinate (px) of the human face in the local view. Taking the top left corner of the view as the origin, the y-coordinate represents the vertical position of the human face relative to the origin. + * - width: The width (px) of the human face in the captured view. + * - height: The height (px) of the human face in the captured view. + * @param vecDistance An int array of length 'numFaces', which represents the distance (cm) between the human face and the screen. + * @param numFaces The number of faces detected. If the value is 0, it means that no human face is detected.
+ */ + virtual void onFacePositionChanged(int imageWidth, int imageHeight, + const Rectangle* vecRectangle, const int* vecDistance, + int numFaces) { + (void) imageWidth; + (void) imageHeight; + (void) vecRectangle; + (void) vecDistance; + (void) numFaces; + } +#endif + /** + * Occurs when the video stops playing. + * @deprecated Use `LOCAL_VIDEO_STREAM_STATE_STOPPED(0)` in the onLocalVideoStateChanged callback instead. + * + * The app can use this callback to change the configuration of the view (for example, displaying + * other pictures in the view) after the video stops playing. + */ + virtual void onVideoStopped() __deprecated {} + + /** Occurs when the playback state of the music file changes. + * + * This callback occurs when the playback state of the music file changes, and reports the current state and error code. + + @param state The playback state of the music file. See #AUDIO_MIXING_STATE_TYPE. + @param reason The reason for the change of the music file playback state. See #AUDIO_MIXING_REASON_TYPE. + */ + virtual void onAudioMixingStateChanged(AUDIO_MIXING_STATE_TYPE state, AUDIO_MIXING_REASON_TYPE reason) { + (void)state; + (void)reason; + } + + /** Occurs when the state of the rhythm player changes. + When you call the \ref IRtcEngine::startRhythmPlayer "startRhythmPlayer" + method and the state of rhythm player changes, the SDK triggers this + callback. + + @param state The state code. See #RHYTHM_PLAYER_STATE_TYPE. + @param reason The error code. See #RHYTHM_PLAYER_REASON. + */ + virtual void onRhythmPlayerStateChanged(RHYTHM_PLAYER_STATE_TYPE state, RHYTHM_PLAYER_REASON reason) { + (void)state; + (void)reason; + } + + /** + * Occurs when the SDK cannot reconnect to the server 10 seconds after its connection to the server is + * interrupted. + * + * The SDK triggers this callback when it cannot connect to the server 10 seconds after calling + * `joinChannel`, regardless of whether it is in the channel or not. 
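onFacePositionChanged delivers parallel arrays of length numFaces. One plausible way to consume them is to pick the nearest detected face, sketched below (`DemoRectangle` and `NearestFace` are hypothetical helpers, not SDK APIs):

```cpp
#include <cassert>

struct DemoRectangle { int x, y, width, height; };  // stand-in for the SDK's Rectangle

// Pick the nearest detected face from the parallel arrays the callback delivers;
// returns its index, or -1 when numFaces is 0 (no face detected).
int NearestFace(const DemoRectangle* /*vecRectangle*/, const int* vecDistance, int numFaces) {
  int best = -1;
  for (int i = 0; i < numFaces; ++i) {
    if (best < 0 || vecDistance[i] < vecDistance[best]) best = i;
  }
  return best;
}
```

The rectangle at the returned index could then drive a UI overlay; per the note above, the reported distance is approximate on Android, so treat it as a hint rather than a measurement.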
If the SDK fails to rejoin + * the channel 20 minutes after being disconnected from Agora's edge server, the SDK stops rejoining the channel. + */ + virtual void onConnectionLost() {} + + /** Occurs when the connection between the SDK and the server is interrupted. + * @deprecated Use `onConnectionStateChanged` instead. + + The SDK triggers this callback when it loses connection with the server for more + than 4 seconds after the connection is established. After triggering this + callback, the SDK tries to reconnect to the server. If the reconnection fails + within a certain period (10 seconds by default), the onConnectionLost() + callback is triggered. If the SDK fails to rejoin the channel 20 minutes after + being disconnected from Agora's edge server, the SDK stops rejoining the channel. + + */ + virtual void onConnectionInterrupted() __deprecated {} + + /** Occurs when your connection is banned by the Agora server. + * @deprecated Use `onConnectionStateChanged` instead. + */ + virtual void onConnectionBanned() __deprecated {} + + /** Occurs when the local user receives the data stream from the remote user. + * + * The SDK triggers this callback when the user receives the data stream that another user sends + * by calling the \ref agora::rtc::IRtcEngine::sendStreamMessage "sendStreamMessage" method. + * + * @param uid ID of the user who sends the data stream. + * @param streamId The ID of the stream data. + * @param data The data stream. + * @param length The length (bytes) of the data stream. + * @param sentTs The time when the data stream is sent. + */ + virtual void onStreamMessage(uid_t uid, int streamId, const char* data, size_t length, uint64_t sentTs) { + (void)uid; + (void)streamId; + (void)data; + (void)length; + (void)sentTs; + } + + /** Occurs when the local user does not receive the data stream from the remote user.
+ * + * The SDK triggers this callback when the user fails to receive the data stream that another user sends + * by calling the \ref agora::rtc::IRtcEngine::sendStreamMessage "sendStreamMessage" method. + * + * @param uid ID of the user who sends the data stream. + * @param streamId The ID of the stream data. + * @param code The error code. + * @param missed The number of lost messages. + * @param cached The number of incoming cached messages when the data stream is + * interrupted. + */ + virtual void onStreamMessageError(uid_t uid, int streamId, int code, int missed, int cached) { + (void)uid; + (void)streamId; + (void)code; + (void)missed; + (void)cached; + } + + /** + * Occurs when the token expires. + * + * When the token expires during a call, the SDK triggers this callback to remind the app to renew the token. + * + * Upon receiving this callback, generate a new token at your app server and call + * `joinChannel` to pass the new token to the SDK. + * + */ + virtual void onRequestToken() {} + + /** + * Occurs when the token will expire in 30 seconds. + * + * When the token is about to expire in 30 seconds, the SDK triggers this callback to remind the app to renew the token. + + * Upon receiving this callback, generate a new token at your app server and call + * \ref IRtcEngine::renewToken "renewToken" to pass the new token to the SDK. + * + * + * @param token The token that will expire in 30 seconds. + */ + virtual void onTokenPrivilegeWillExpire(const char* token) { + (void)token; + } + + /** + * Occurs when connection license verification fails. + * + * You can determine the reason from the error code. + */ + virtual void onLicenseValidationFailure(LICENSE_ERROR_TYPE error) { + (void)error; + } + + /** Occurs when the first local audio frame is published. + * + * The SDK triggers this callback under one of the following circumstances: + * - The local client enables the audio module and calls `joinChannel` successfully.
+ * - The local client calls `muteLocalAudioStream(true)` and `muteLocalAudioStream(false)` in sequence. + * - The local client calls `disableAudio` and `enableAudio` in sequence. + * - The local client calls `pushAudioFrame` to successfully push the audio frame to the SDK. + * + * @param elapsed The time elapsed (ms) from the local user calling `joinChannel` until the SDK triggers this callback. + */ + virtual void onFirstLocalAudioFramePublished(int elapsed) { + (void)elapsed; + } + + /** + * Occurs when the SDK decodes the first remote audio frame for playback. + * + * @deprecated Use `onRemoteAudioStateChanged` instead. + * The SDK triggers this callback under one of the following circumstances: + * - The remote user joins the channel and sends the audio stream for the first time. + * - The remote user's audio goes offline and then comes back online to resend audio, which means the local user has not + * received audio for 15 seconds. Reasons for such an interruption include: + * - The remote user leaves the channel. + * - The remote user drops offline. + * - The remote user calls muteLocalAudioStream to stop sending the audio stream. + * - The remote user calls disableAudio to disable audio. + * @param uid User ID of the remote user sending the audio stream. + * @param elapsed The time elapsed (ms) from the local user calling `joinChannel` + * until this callback is triggered. + */ + virtual void onFirstRemoteAudioDecoded(uid_t uid, int elapsed) __deprecated { + (void)uid; + (void)elapsed; + } + + /** Occurs when the SDK receives the first audio frame from a specific remote user. + * @deprecated Use `onRemoteAudioStateChanged` instead. + * + * @param uid ID of the remote user. + * @param elapsed The time elapsed (ms) from the local user calling `joinChannel` + * until this callback is triggered. + */ + virtual void onFirstRemoteAudioFrame(uid_t uid, int elapsed) __deprecated { + (void)uid; + (void)elapsed; + } + + /** Occurs when the local audio state changes.
+ * + * When the state of the local audio stream changes (including the state of the audio capture and encoding), the SDK + * triggers this callback to report the current state. This callback indicates the state of the local audio stream, + * and allows you to troubleshoot issues when audio exceptions occur. + * + * @note + * When the state is `LOCAL_AUDIO_STREAM_STATE_FAILED(3)`, see the `reason` + * parameter for details. + * + * @param state State of the local audio. See #LOCAL_AUDIO_STREAM_STATE. + * @param reason The reason for the local audio state change. + * See #LOCAL_AUDIO_STREAM_REASON. + */ + virtual void onLocalAudioStateChanged(LOCAL_AUDIO_STREAM_STATE state, LOCAL_AUDIO_STREAM_REASON reason) { + (void)state; + (void)reason; + } + + /** Occurs when the remote audio state changes. + * + * When the audio state of a remote user (in the voice/video call channel) or host (in the live streaming channel) + * changes, the SDK triggers this callback to report the current state of the remote audio stream. + * + * @note This callback does not work properly when the number of users (in the voice/video call channel) or hosts + * (in the live streaming channel) in the channel exceeds 17. + * + * @param uid ID of the remote user whose audio state changes. + * @param state State of the remote audio. See #REMOTE_AUDIO_STATE. + * @param reason The reason for the remote audio state change. + * See #REMOTE_AUDIO_STATE_REASON. + * @param elapsed Time elapsed (ms) from the local user calling the + * `joinChannel` method until the SDK + * triggers this callback. + */ + virtual void onRemoteAudioStateChanged(uid_t uid, REMOTE_AUDIO_STATE state, REMOTE_AUDIO_STATE_REASON reason, int elapsed) { + (void)uid; + (void)state; + (void)reason; + (void)elapsed; + } + + /** + * Occurs when an active speaker is detected. + * + * After a successful call of `enableAudioVolumeIndication`, the SDK continuously detects which remote user has the + * loudest volume.
During the current period, the remote user who is detected as the loudest most often + * is the most active user. + * + * When the number of users is no less than two and an active remote speaker exists, the SDK triggers this callback and reports the uid of the most active remote speaker. + * - If the most active remote speaker is always the same user, the SDK triggers the `onActiveSpeaker` callback only once. + * - If the most active remote speaker changes to another user, the SDK triggers this callback again and reports the uid of the new active remote speaker. + * + * @param uid The user ID of the active speaker. A `uid` of 0 means the local user. + */ + virtual void onActiveSpeaker(uid_t uid) { + (void)uid; + } + + /** Reports the result of content inspection. + * + * @param result The result of content inspection: #CONTENT_INSPECT_RESULT. + */ + virtual void onContentInspectResult(media::CONTENT_INSPECT_RESULT result) { (void)result; } + + /** Reports the result of taking a video snapshot. + * + * After a successful `takeSnapshot` method call, the SDK triggers this callback to report whether the snapshot is + * successfully taken, as well as the details for that snapshot. + * + * @param uid The user ID. A `uid` of 0 indicates the local user. + * @param filePath The local path of the snapshot. + * @param width The width (px) of the snapshot. + * @param height The height (px) of the snapshot. + * @param errCode The message that confirms success or gives the reason why the snapshot is not successfully taken: + * - 0: Success. + * - < 0: Failure. + * - -1: The SDK fails to write data to a file or encode a JPEG image. + * - -2: The SDK does not find the video stream of the specified user within one second after the `takeSnapshot` method call succeeds. + * - -3: Calling the `takeSnapshot` method too frequently. Call the `takeSnapshot` method after receiving the `onSnapshotTaken` + * callback from the previous call.
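+ *
+ * @par Example
+ * An illustrative sketch of checking the snapshot result (the handler class
+ * name and logging are placeholders):
+ * @code
+ * class MyEventHandler : public IRtcEngineEventHandler {
+ *  public:
+ *   void onSnapshotTaken(uid_t uid, const char* filePath, int width,
+ *                        int height, int errCode) override {
+ *     if (errCode == 0) {
+ *       printf("snapshot of uid %u saved to %s (%dx%d)\n",
+ *              uid, filePath, width, height);
+ *     } else {
+ *       // See the errCode values above; for -3, wait for this callback
+ *       // before calling takeSnapshot again.
+ *       printf("snapshot failed: %d\n", errCode);
+ *     }
+ *   }
+ * };
+ * @endcode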
+ */ + virtual void onSnapshotTaken(uid_t uid, const char* filePath, int width, int height, int errCode) { + (void)uid; + (void)filePath; + (void)width; + (void)height; + (void)errCode; + } + + /** + * Occurs when the user role switches in the interactive live streaming. + * + * @param oldRole The old role of the user: #CLIENT_ROLE_TYPE. + * @param newRole The new role of the user: #CLIENT_ROLE_TYPE. + * @param newRoleOptions The client role options of the new role: #ClientRoleOptions. + */ + virtual void onClientRoleChanged(CLIENT_ROLE_TYPE oldRole, CLIENT_ROLE_TYPE newRole, const ClientRoleOptions& newRoleOptions) { + (void)oldRole; + (void)newRole; + (void)newRoleOptions; + } + + /** + * Occurs when the user role in a Live-Broadcast channel fails to switch, for example, from a broadcaster + * to an audience or vice versa. + * + * @param reason The reason for failing to change the client role: #CLIENT_ROLE_CHANGE_FAILED_REASON. + * @param currentRole The current role of the user: #CLIENT_ROLE_TYPE. + */ + virtual void onClientRoleChangeFailed(CLIENT_ROLE_CHANGE_FAILED_REASON reason, CLIENT_ROLE_TYPE currentRole) { + (void)reason; + (void)currentRole; + } + + /** Occurs when the audio device volume changes. + @param deviceType The device type, see #MEDIA_DEVICE_TYPE + @param volume The volume of the audio device. + @param muted Whether the audio device is muted: + - true: The audio device is muted. + - false: The audio device is not muted. + */ + virtual void onAudioDeviceVolumeChanged(MEDIA_DEVICE_TYPE deviceType, int volume, bool muted) { + (void)deviceType; + (void)volume; + (void)muted; + } + + /** + * Occurs when the state of the RTMP streaming changes. + * + * When the media push state changes, the SDK triggers this callback and reports the URL address and the current state + * of the media push. This callback indicates the state of the media push. 
When exceptions occur, you can troubleshoot + * issues by referring to the detailed error descriptions in the error code. + * + * @param url The URL address where the state of the media push changes. + * @param state The current state of the media push: #RTMP_STREAM_PUBLISH_STATE. + * @param reason The detailed error information for the media push: #RTMP_STREAM_PUBLISH_REASON. + */ + virtual void onRtmpStreamingStateChanged(const char* url, RTMP_STREAM_PUBLISH_STATE state, + RTMP_STREAM_PUBLISH_REASON reason) { + (void)url; + (void)state; + (void)reason; + } + + /** Reports events during the media push. + * + * @param url The URL for media push. + * @param eventCode The event code of media push. See RTMP_STREAMING_EVENT for details. + */ + virtual void onRtmpStreamingEvent(const char* url, RTMP_STREAMING_EVENT eventCode) { + (void)url; + (void)eventCode; + } + + /** + * Occurs when the publisher's transcoding settings are updated. + * + * When the `LiveTranscoding` class in \ref IRtcEngine::setLiveTranscoding "setLiveTranscoding" + * updates, the SDK triggers this callback to report the update information. + * + * @note + * If you call the `setLiveTranscoding` method to set the `LiveTranscoding` class for the first time, the SDK + * does not trigger this callback. + */ + virtual void onTranscodingUpdated() {} + + /** Occurs when the local audio route changes (for Android, iOS, and macOS only). + + The SDK triggers this callback when the local audio route switches to an + earpiece, speakerphone, headset, or Bluetooth device. + @param routing The current audio output routing: + - -1: Default. + - 0: Headset. + - 1: Earpiece. + - 2: Headset with no microphone. + - 3: Speakerphone. + - 4: Loudspeaker. + - 5: Bluetooth headset. + */ + virtual void onAudioRoutingChanged(int routing) { (void)routing; } + + /** + * Occurs when the state of the media stream relay changes. 
+ * + * The SDK reports the state of the current media relay and possible error messages in this + * callback. + * + * @param state The state code: + * - `RELAY_STATE_IDLE(0)`: The SDK is initializing. + * - `RELAY_STATE_CONNECTING(1)`: The SDK tries to relay the media stream to the destination + * channel. + * - `RELAY_STATE_RUNNING(2)`: The SDK successfully relays the media stream to the destination + * channel. + * - `RELAY_STATE_FAILURE(3)`: A failure occurs. See the details in `code`. + * @param code The error code: + * - `RELAY_OK(0)`: The state is normal. + * - `RELAY_ERROR_SERVER_ERROR_RESPONSE(1)`: An error occurs in the server response. + * - `RELAY_ERROR_SERVER_NO_RESPONSE(2)`: No server response. You can call the leaveChannel method + * to leave the channel. + * - `RELAY_ERROR_NO_RESOURCE_AVAILABLE(3)`: The SDK fails to access the service, probably due to + * limited resources of the server. + * - `RELAY_ERROR_FAILED_JOIN_SRC(4)`: Fails to send the relay request. + * - `RELAY_ERROR_FAILED_JOIN_DEST(5)`: Fails to accept the relay request. + * - `RELAY_ERROR_FAILED_PACKET_RECEIVED_FROM_SRC(6)`: The server fails to receive the media + * stream. + * - `RELAY_ERROR_FAILED_PACKET_SENT_TO_DEST(7)`: The server fails to send the media stream. + * - `RELAY_ERROR_SERVER_CONNECTION_LOST(8)`: The SDK disconnects from the server due to poor + * network connections. You can call the leaveChannel method to leave the channel. + * - `RELAY_ERROR_INTERNAL_ERROR(9)`: An internal error occurs in the server. + * - `RELAY_ERROR_SRC_TOKEN_EXPIRED(10)`: The token of the source channel has expired. + * - `RELAY_ERROR_DEST_TOKEN_EXPIRED(11)`: The token of the destination channel has expired. 
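+ *
+ * @par Example
+ * An illustrative sketch of reacting to relay state changes (the handler
+ * class name is a placeholder):
+ * @code
+ * class MyEventHandler : public IRtcEngineEventHandler {
+ *  public:
+ *   void onChannelMediaRelayStateChanged(int state, int code) override {
+ *     if (state == 3) {  // RELAY_STATE_FAILURE
+ *       // Inspect `code`; for RELAY_ERROR_SERVER_NO_RESPONSE(2) or
+ *       // RELAY_ERROR_SERVER_CONNECTION_LOST(8), consider calling
+ *       // leaveChannel and retrying the relay later.
+ *     }
+ *   }
+ * };
+ * @endcode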
+ */ + virtual void onChannelMediaRelayStateChanged(int state, int code) { + (void)state; + (void)code; + } + + /** + * Occurs when the published media stream falls back to an audio-only stream due to poor network conditions or + * switches back to video stream after the network conditions improve. + * + * If you call `setLocalPublishFallbackOption` and set `option` as `STREAM_FALLBACK_OPTION_AUDIO_ONLY(2)`, this + * callback is triggered when the locally published stream falls back to audio-only mode due to poor uplink + * conditions, or when the audio stream switches back to the video after the uplink network condition improves. + * Once the published stream falls back to audio only, the remote app receives the `onRemoteVideoStateChanged` callback. + * + * @param isFallbackOrRecover Whether the published stream fell back to audio-only or switched back to the video: + * - `true`: The published stream fell back to audio-only due to poor network conditions. + * - `false`: The published stream switched back to the video after the network conditions improved. + */ + virtual void onLocalPublishFallbackToAudioOnly(bool isFallbackOrRecover) { + (void)isFallbackOrRecover; + } + + /** + * Occurs when the remote media stream falls back to audio-only stream due to poor network conditions or + * switches back to video stream after the network conditions improve. + * + * If you call `setRemoteSubscribeFallbackOption` and set `option` as `STREAM_FALLBACK_OPTION_AUDIO_ONLY(2)`, this + * callback is triggered when the remotely subscribed media stream falls back to audio-only mode due to poor downlink + * conditions, or when the remotely subscribed media stream switches back to the video after the downlink network + * condition improves. + * + * @note Once the remote media stream is switched to the low stream due to poor network conditions, you can monitor + * the stream switch between a high and low stream in the `onRemoteVideoStats` callback. 
+ * + * @param uid ID of the remote user sending the stream. + * @param isFallbackOrRecover Whether the remote media stream fell back to audio-only or switched back to the video: + * - `true`: The remote media stream fell back to audio-only due to poor network conditions. + * - `false`: The remote media stream switched back to the video stream after the network conditions improved. + */ + virtual void onRemoteSubscribeFallbackToAudioOnly(uid_t uid, bool isFallbackOrRecover) { + (void)uid; + (void)isFallbackOrRecover; + } + + /** Reports the transport-layer statistics of each remote audio stream. + * @deprecated Use `onRemoteAudioStats` instead. + + This callback reports the transport-layer statistics, such as the packet loss rate and network time delay, once every + two seconds after the local user receives an audio packet from a remote user. During a call, when the user receives + the audio packet sent by the remote user/host, the callback is triggered every 2 seconds. + + @param uid ID of the remote user whose audio data packet is received. + @param delay The network time delay (ms) from the sender to the receiver. + @param lost The Packet loss rate (%) of the audio packet sent from the remote + user. + @param rxKBitRate Received bitrate (Kbps) of the audio packet sent from the + remote user. + */ + virtual void onRemoteAudioTransportStats(uid_t uid, unsigned short delay, unsigned short lost, unsigned short rxKBitRate) __deprecated { + (void)uid; + (void)delay; + (void)lost; + (void)rxKBitRate; + } + + /** Reports the transport-layer statistics of each remote video stream. + * @deprecated Use `onRemoteVideoStats` instead. + + This callback reports the transport-layer statistics, such as the packet loss rate and network time + delay, once every two seconds after the local user receives a video packet from a remote user. + + During a call, when the user receives the video packet sent by the remote user/host, the callback is + triggered every 2 seconds. 
+ + @param uid ID of the remote user whose video packet is received. + @param delay The network time delay (ms) from the remote user sending the + video packet to the local user. + @param lost The packet loss rate (%) of the video packet sent from the remote + user. + @param rxKBitRate The bitrate (Kbps) of the video packet sent from + the remote user. + */ + virtual void onRemoteVideoTransportStats(uid_t uid, unsigned short delay, unsigned short lost, unsigned short rxKBitRate) __deprecated { + (void)uid; + (void)delay; + (void)lost; + (void)rxKBitRate; + } + + /** Occurs when the network connection state changes. + * + * When the network connection state changes, the SDK triggers this callback and reports the current + * connection state and the reason for the change. + + @param state The current connection state. See #CONNECTION_STATE_TYPE. + @param reason The reason for a connection state change. See #CONNECTION_CHANGED_REASON_TYPE. + */ + virtual void onConnectionStateChanged( + CONNECTION_STATE_TYPE state, CONNECTION_CHANGED_REASON_TYPE reason) { + (void)state; + (void)reason; + } + + /** Occurs when a Wi-Fi acceleration message needs to be sent to the user. + * + * @param reason The reason for notifying the user. + * @param action The suggested action for the user. + * @param wlAccMsg The content of the message for the user. + */ + virtual void onWlAccMessage(WLACC_MESSAGE_REASON reason, WLACC_SUGGEST_ACTION action, const char* wlAccMsg) { + (void)reason; + (void)action; + (void)wlAccMsg; + } + + /** Reports the statistics of the Wi-Fi acceleration optimization effect. + * + * @param currentStats The instantaneous value of the optimization effect. + * @param averageStats The average value of the cumulative optimization effect. + */ + virtual void onWlAccStats(const WlAccStats& currentStats, const WlAccStats& averageStats) { + (void)currentStats; + (void)averageStats; + } + + /** Occurs when the local network type changes.
+ * + * This callback occurs when the connection state of the local user changes. You can get the + * connection state and reason for the state change in this callback. When the network connection + * is interrupted, this callback indicates whether the interruption is caused by a network type + * change or poor network conditions. + + @param type The type of the local network connection. See #NETWORK_TYPE. + */ + virtual void onNetworkTypeChanged(NETWORK_TYPE type) { + (void)type; + } + + /** Reports the built-in encryption errors. + * + * When encryption is enabled by calling `enableEncryption`, the SDK triggers this callback if an + * error occurs in encryption or decryption on the sender or the receiver side. + + @param errorType The error type. See #ENCRYPTION_ERROR_TYPE. + */ + virtual void onEncryptionError(ENCRYPTION_ERROR_TYPE errorType) { + (void)errorType; + } + + /** Occurs when the SDK cannot get the device permission. + * + * When the SDK fails to get the device permission, the SDK triggers this callback to report which + * device permission cannot be obtained. + * + * @note This callback is for Android and iOS only. + + @param permissionType The type of the device permission. See #PERMISSION_TYPE. + */ + virtual void onPermissionError(PERMISSION_TYPE permissionType) { + (void)permissionType; + } + + /** Occurs when the local user registers a user account. + * + * After the local user successfully calls `registerLocalUserAccount` to register the user account + * or calls `joinChannelWithUserAccount` to join a channel, the SDK triggers the callback and + * reports the local user's UID and user account. + + @param uid The ID of the local user. + @param userAccount The user account of the local user. + */ + virtual void onLocalUserRegistered(uid_t uid, const char* userAccount) { + (void)uid; + (void)userAccount; + } + + /** Occurs when the SDK gets the user ID and user account of the remote user.
+ + After a remote user joins the channel, the SDK gets the UID and user account of the remote user, + caches them in a mapping table object (`userInfo`), and triggers this callback on the local client. + + @param uid The ID of the remote user. + @param info The `UserInfo` object that contains the user ID and user account of the remote user. + */ + virtual void onUserInfoUpdated(uid_t uid, const UserInfo& info) { + (void)uid; + (void)info; + } + + /** + * Occurs when the user account is updated. + * + * @param uid The user ID. + * @param userAccount The user account. + */ + virtual void onUserAccountUpdated(uid_t uid, const char* userAccount) { + (void)uid; + (void)userAccount; + } + + /** + * Reports the tracing result of the video rendering event of the user. + * + * @param uid The user ID. + * @param currentEvent The current event of the tracing result: #MEDIA_TRACE_EVENT. + * @param tracingInfo The tracing result: #VideoRenderingTracingInfo. + */ + virtual void onVideoRenderingTracingResult(uid_t uid, MEDIA_TRACE_EVENT currentEvent, VideoRenderingTracingInfo tracingInfo) { + (void)uid; + (void)currentEvent; + (void)tracingInfo; + } + + /** + * Occurs when the local video transcoder stream has an error. + * + * @param stream The video stream in which the error occurred. See TranscodingVideoStream. + * @param error The error code. See VIDEO_TRANSCODER_ERROR. + */ + virtual void onLocalVideoTranscoderError(const TranscodingVideoStream& stream, VIDEO_TRANSCODER_ERROR error) { + (void)stream; + (void)error; + } + + /** + * Reports the user log upload result. + * @param requestId The request ID of the upload. + * @param success Whether the upload succeeds. + * @param reason The reason for the upload result: 0: OK; 1: network error; 2: server error. + */ + virtual void onUploadLogResult(const char* requestId, bool success, UPLOAD_ERROR_REASON reason) { + (void)requestId; + (void)success; + (void)reason; + } + + /** + * Occurs when the audio subscribing state changes. + * + * @param channel The name of the channel.
+ * @param uid The ID of the remote user. + * @param oldState The previous subscribing status: #STREAM_SUBSCRIBE_STATE. + * @param newState The current subscribing status: #STREAM_SUBSCRIBE_STATE. + * @param elapseSinceLastState The time elapsed (ms) from the previous state to the current state. + */ + virtual void onAudioSubscribeStateChanged(const char* channel, uid_t uid, STREAM_SUBSCRIBE_STATE oldState, STREAM_SUBSCRIBE_STATE newState, int elapseSinceLastState) { + (void)channel; + (void)uid; + (void)oldState; + (void)newState; + (void)elapseSinceLastState; + } + + /** + * Occurs when the video subscribing state changes. + * + * @param channel The name of the channel. + * @param uid The ID of the remote user. + * @param oldState The previous subscribing status: #STREAM_SUBSCRIBE_STATE. + * @param newState The current subscribing status: #STREAM_SUBSCRIBE_STATE. + * @param elapseSinceLastState The time elapsed (ms) from the previous state to the current state. + */ + virtual void onVideoSubscribeStateChanged(const char* channel, uid_t uid, STREAM_SUBSCRIBE_STATE oldState, STREAM_SUBSCRIBE_STATE newState, int elapseSinceLastState) { + (void)channel; + (void)uid; + (void)oldState; + (void)newState; + (void)elapseSinceLastState; + } + + /** + * Occurs when the audio publishing state changes. + * + * @param channel The name of the channel. + * @param oldState The previous publishing state: #STREAM_PUBLISH_STATE. + * @param newState The current publishing state: #STREAM_PUBLISH_STATE. + * @param elapseSinceLastState The time elapsed (ms) from the previous state to the current state. + */ + virtual void onAudioPublishStateChanged(const char* channel, STREAM_PUBLISH_STATE oldState, STREAM_PUBLISH_STATE newState, int elapseSinceLastState) { + (void)channel; + (void)oldState; + (void)newState; + (void)elapseSinceLastState; + } + + /** + * Occurs when the video publishing state changes. + * + * @param source The video source type. 
+ * @param channel The name of the channel. + * @param oldState The previous publishing state: #STREAM_PUBLISH_STATE. + * @param newState The current publishing state: #STREAM_PUBLISH_STATE. + * @param elapseSinceLastState The time elapsed (ms) from the previous state to the current state. + */ + virtual void onVideoPublishStateChanged(VIDEO_SOURCE_TYPE source, const char* channel, STREAM_PUBLISH_STATE oldState, STREAM_PUBLISH_STATE newState, int elapseSinceLastState) { + (void)source; + (void)channel; + (void)oldState; + (void)newState; + (void)elapseSinceLastState; + } + + /** + * Occurs when the SDK receives a transcoded video stream that contains video layout information. + * + * @param uid The user ID of the transcoded stream. + * @param width The width (px) of the transcoded stream. + * @param height The height (px) of the transcoded stream. + * @param layoutCount The number of layout entries in the transcoded stream. + * @param layoutlist The video layout information list of the transcoded stream. + */ + virtual void onTranscodedStreamLayoutInfo(uid_t uid, int width, int height, int layoutCount, const VideoLayout* layoutlist) { + (void)uid; + (void)width; + (void)height; + (void)layoutCount; + (void)layoutlist; + } + + /** + * The event callback of the extension. + * + * To listen for events while the extension is running, you need to register this callback. + * + * @param provider The name of the extension provider. + * @param extension The name of the extension. + * @param key The key of the extension. + * @param value The value of the extension key. + */ + virtual void onExtensionEvent(const char* provider, const char* extension, const char* key, const char* value) { + (void)provider; + (void)extension; + (void)key; + (void)value; + } + + /** + * Occurs when the extension is enabled. + * + * After a successful call of `enableExtension(true)`, the extension triggers this callback. + * + * @param provider The name of the extension provider. + * @param extension The name of the extension.
+ */ + virtual void onExtensionStarted(const char* provider, const char* extension) { + (void)provider; + (void)extension; + } + + /** + * Occurs when the extension is disabled. + * + * After a successful call of `enableExtension(false)`, the extension triggers this callback. + * + * @param provider The name of the extension provider. + * @param extension The name of the extension. + */ + virtual void onExtensionStopped(const char* provider, const char* extension) { + (void)provider; + (void)extension; + } + + /** + * Occurs when the extension runs incorrectly. + * + * When calling `enableExtension(true)` fails or the extension runs in error, the extension triggers + * this callback and reports the error code and reason. + * + * @param provider The name of the extension provider. + * @param extension The name of the extension. + * @param error The error code. For details, see the extension documentation provided by the extension provider. + * @param message The error message. For details, see the extension documentation provided by the extension provider. + */ + virtual void onExtensionError(const char* provider, const char* extension, int error, const char* message) { + (void)provider; + (void)extension; + (void)error; + (void)message; + } + + /** + * Occurs when the SDK receives RTM setting change response. + * + * @technical preview + * @param code The error code. + */ + virtual void onSetRtmFlagResult(int code) { + (void)code; + } +}; + +/** + * The IVideoDeviceCollection class. You can get information related to video devices through this interface. + */ +class IVideoDeviceCollection { + public: + virtual ~IVideoDeviceCollection() {} + + /** + * Gets the total number of the indexed video capture devices in the system. + * + * @return The total number of the indexed video capture devices. + */ + virtual int getCount() = 0; + + /** + * Specifies a device with the device ID. + * + * @param deviceIdUTF8 The device ID. 
The maximum length is #MAX_DEVICE_ID_LENGTH_TYPE. Plugging or + * unplugging the video device does not change the value of deviceId. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setDevice(const char deviceIdUTF8[MAX_DEVICE_ID_LENGTH]) = 0; + + /** + * Gets a specified piece of information about an indexed video device. + * + * @param index The index value of the video device. The value of this parameter must be less than the value returned in `getCount`. + * @param deviceNameUTF8 The name of the device. The maximum length is #MAX_DEVICE_ID_LENGTH. + * @param deviceIdUTF8 The device ID of the video device. The maximum length is #MAX_DEVICE_ID_LENGTH. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getDevice(int index, char deviceNameUTF8[MAX_DEVICE_ID_LENGTH], + char deviceIdUTF8[MAX_DEVICE_ID_LENGTH]) = 0; + + /** + * Releases all the resources occupied by the IVideoDeviceCollection object. + */ + virtual void release() = 0; +}; + +/** + * The IVideoDeviceManager class. + */ +class IVideoDeviceManager { + public: + virtual ~IVideoDeviceManager() {} + /** + * Enumerates the video devices. + * + * This method returns an `IVideoDeviceCollection` object including all video devices in the system. + * With the `IVideoDeviceCollection` object, the application can enumerate video devices. The + * application must call the release method to release the returned object after using it. + * + * @return + * - Success: An `IVideoDeviceCollection` object including all video devices in the system. + * - Failure: NULL. + */ + virtual IVideoDeviceCollection* enumerateVideoDevices() = 0; + + /** + * Specifies the video capture device with the device ID. + * + * @param deviceIdUTF8 The device ID. You can get the device ID by calling `enumerateVideoDevices`. + * The maximum length is #MAX_DEVICE_ID_LENGTH. + * + * @return + * - 0: Success. + * - < 0: Failure.
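+ *
+ * @par Example
+ * An illustrative sketch of selecting the first enumerated camera
+ * (`videoDeviceManager` is assumed to be a valid IVideoDeviceManager*):
+ * @code
+ * IVideoDeviceCollection* devices = videoDeviceManager->enumerateVideoDevices();
+ * if (devices && devices->getCount() > 0) {
+ *   char name[MAX_DEVICE_ID_LENGTH] = {0};
+ *   char id[MAX_DEVICE_ID_LENGTH] = {0};
+ *   // Query the first device and make it the active capture device.
+ *   if (devices->getDevice(0, name, id) == 0) {
+ *     videoDeviceManager->setDevice(id);
+ *   }
+ * }
+ * if (devices) devices->release();  // release the collection after use
+ * @endcode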
+ */ + virtual int setDevice(const char deviceIdUTF8[MAX_DEVICE_ID_LENGTH]) = 0; + + /** + * Retrieves the current video capture device. + * @param deviceIdUTF8 Output parameter. The device ID. The maximum length is #MAX_DEVICE_ID_LENGTH. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getDevice(char deviceIdUTF8[MAX_DEVICE_ID_LENGTH]) = 0; + +#if defined(_WIN32) || (defined(__linux__) && !defined(__ANDROID__)) || \ + (defined(__APPLE__) && TARGET_OS_MAC && !TARGET_OS_IPHONE) + /** + * Gets the number of video formats supported by the specified video capture device. + * + * Video capture devices may support multiple video formats, and each format supports different + * combinations of video frame width, video frame height, and frame rate. + * + * You can call this method to get how many video formats the specified video capture device can + * support, and then call `getCapability` to get the specific video frame information in the + * specified video format. + * + * @param deviceIdUTF8 The ID of the video capture device. + * + * @return + * - > 0: Success. Returns the number of video formats supported by this device. For example: if the + * specified camera supports 10 different video formats, the return value is 10. + * - < 0: Failure. + */ + virtual int numberOfCapabilities(const char* deviceIdUTF8) = 0; + + /** + * Gets the detailed video frame information of the video capture device in the specified video format. + * + * After calling `numberOfCapabilities` to get the number of video formats supported by the video capture + * device, you can call this method to get the specific video frame information supported by the + * specified index number. + * + * @param deviceIdUTF8 ID of the video capture device. + * @param deviceCapabilityNumber The index number of the video format. If the return value of `numberOfCapabilities` + * is i, the value range of this parameter is [0,i). + * @param capability Output parameter.
Indicates the specific information of the specified video format, + * including width (px), height (px), and frame rate (fps). See VideoFormat. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getCapability(const char* deviceIdUTF8, const uint32_t deviceCapabilityNumber, VideoFormat& capability) = 0; +#endif + /** + * Starts the video capture device test. + * + * This method tests whether the video capture device works properly. + * Before calling this method, ensure that you have already called + * \ref IRtcEngine::enableVideo "enableVideo", and that the window handle (`hwnd`) + * you pass in is valid. + * + * @param hwnd The window handle in which the video is displayed. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int startDeviceTest(view_t hwnd) = 0; + + /** + * Stops the video capture device test. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int stopDeviceTest() = 0; + + /** + * Releases all the resources occupied by the `IVideoDeviceManager` object. + */ + virtual void release() = 0; +}; + +/** + * The context of IRtcEngine. + */ +struct RtcEngineContext { + /** + * The event handler for IRtcEngine. + */ + IRtcEngineEventHandler* eventHandler; + /** + * The App ID issued by Agora for your project. Only users in apps with the same App ID can join the + * same channel and communicate with each other. An App ID can only be used to create one `IRtcEngine` + * instance. To change your App ID, call release to destroy the current IRtcEngine instance, and then + * create a new one. + */ + const char* appId; + /** + * - For Android, it is the context of Activity or Application. + * - For Windows, it is the window handle of the app. Once set, this parameter enables you to plug + * or unplug the video devices while they are powered. + */ + void* context; + /** + * The channel profile. See #CHANNEL_PROFILE_TYPE.
+ */ + CHANNEL_PROFILE_TYPE channelProfile; + + /** + * The license used for verification when connecting to a channel. Charges are calculated according to the license. + */ + const char* license; + + /** + * The audio application scenario. See #AUDIO_SCENARIO_TYPE. + * + * @note Agora recommends the following scenarios: + * - `AUDIO_SCENARIO_DEFAULT(0)` + * - `AUDIO_SCENARIO_GAME_STREAMING(3)` + */ + AUDIO_SCENARIO_TYPE audioScenario; + /** + * The region for connection. This is an advanced feature and applies to scenarios that have regional restrictions. + * + * For the regions that Agora supports, see #AREA_CODE. The area codes support bitwise operation. + * + * After specifying the region, the app integrated with the Agora SDK connects to the Agora servers + * within that region. + */ + unsigned int areaCode; + + /** + * The log files that the SDK outputs. See LogConfig. + * + * By default, the SDK generates five SDK log files and five API call log files with the following rules: + * - The SDK log files are: `agorasdk.log`, `agorasdk.1.log`, `agorasdk.2.log`, `agorasdk.3.log`, and `agorasdk.4.log`. + * - The API call log files are: `agoraapi.log`, `agoraapi.1.log`, `agoraapi.2.log`, `agoraapi.3.log`, and `agoraapi.4.log`. + * - The default size for each SDK log file is 1,024 KB; the default size for each API call log file is 2,048 KB. These log files are encoded in UTF-8. + * - The SDK writes the latest logs in `agorasdk.log` or `agoraapi.log`. + * - When `agorasdk.log` is full, the SDK processes the log files in the following order: + * - Delete the `agorasdk.4.log` file (if any). + * - Rename `agorasdk.3.log` to `agorasdk.4.log`. + * - Rename `agorasdk.2.log` to `agorasdk.3.log`. + * - Rename `agorasdk.1.log` to `agorasdk.2.log`. + * - Create a new `agorasdk.log` file.
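The rotation rules above can be simulated in a few lines. The `rotateLogs` helper is hypothetical; the SDK performs these renames internally on disk, while this sketch only shuffles names in memory:

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

// Simulates the agorasdk.log rotation described above: when the active log is
// full, agorasdk.4.log is deleted, the remaining files shift up one index, and
// a fresh agorasdk.log is created. `files` holds the names present on disk.
std::vector<std::string> rotateLogs(std::vector<std::string> files) {
    const std::string base = "agorasdk";
    // Delete agorasdk.4.log if present.
    files.erase(std::remove(files.begin(), files.end(), base + ".4.log"),
                files.end());
    // Rename agorasdk.3.log -> .4.log, then .2 -> .3, then .1 -> .2.
    for (int i = 3; i >= 1; --i) {
        std::replace(files.begin(), files.end(),
                     base + "." + std::to_string(i) + ".log",
                     base + "." + std::to_string(i + 1) + ".log");
    }
    // Rename the full agorasdk.log to agorasdk.1.log.
    std::replace(files.begin(), files.end(), base + ".log", base + ".1.log");
    // Create a new active log.
    files.push_back(base + ".log");
    return files;
}
```

Note the order matters: renaming from the highest index downward prevents a file from being renamed twice in one rotation.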
+ */ + commons::LogConfig logConfig; + + /** + * Thread priority for SDK common threads. + */ + Optional<THREAD_PRIORITY_TYPE> threadPriority; + + /** + * Whether to use the EGL context in the current thread as the SDK's root EGL context, + * which is shared by all EGL-related modules, e.g., camera capture and video rendering. + * + * @note + * This property applies to Android only. + */ + bool useExternalEglContext; + + /** + * Determines whether to enable the domain limit. + * - true: Only connect to servers that have already been parsed by DNS. + * - false: (Default) Connect to servers with no limit. + */ + bool domainLimit; + + /** + * Whether to automatically register Agora extensions when initializing RtcEngine. + * - true: (Default) Automatically register Agora extensions. + * - false: Do not automatically register Agora extensions. The user calls EnableExtension to manually register an Agora extension. + */ + bool autoRegisterAgoraExtensions; + + RtcEngineContext() + : eventHandler(NULL), appId(NULL), context(NULL), channelProfile(CHANNEL_PROFILE_LIVE_BROADCASTING), + license(NULL), audioScenario(AUDIO_SCENARIO_DEFAULT), areaCode(AREA_CODE_GLOB), + logConfig(), useExternalEglContext(false), domainLimit(false), autoRegisterAgoraExtensions(true) {} +}; + +/** Definition of IMetadataObserver +*/ +class IMetadataObserver { +public: + virtual ~IMetadataObserver() {} + + /** The metadata type. + * + * @note We only support video metadata for now. + */ + enum METADATA_TYPE + { + /** -1: (Not supported) Unknown. + */ + UNKNOWN_METADATA = -1, + /** 0: (Supported) Video metadata. + */ + VIDEO_METADATA = 0, + }; + /** + * The maximum metadata size. + */ + enum MAX_METADATA_SIZE_TYPE + { + INVALID_METADATA_SIZE_IN_BYTE = -1, + DEFAULT_METADATA_SIZE_IN_BYTE = 512, + MAX_METADATA_SIZE_IN_BYTE = 1024 + }; + + /** Metadata. + */ + struct Metadata + { + /** The user ID that sent the metadata. + * - For the receiver: The user ID of the user who sent the `metadata`. + * - For the sender: Ignore this value.
+ */ + unsigned int uid; + /** The buffer size of the sent or received `metadata`. + */ + unsigned int size; + /** The buffer address of the sent or received `metadata`. + */ + unsigned char* buffer; + /** The timestamp (ms) of the `metadata`. + * + */ + long long timeStampMs; + + Metadata() : uid(0), size(0), buffer(NULL), timeStampMs(0) {} + }; + + /** Occurs when the SDK requests the maximum size of the metadata. + * + * + * After you successfully complete registration by calling `registerMediaMetadataObserver`, the SDK + * triggers this callback every time a video frame is sent. You need to specify the maximum size of + * the metadata in the return value of this callback. + * + * @return The maximum size of the buffer of the metadata that you want to use. The highest value is + * 1024 bytes. Ensure that you set the return value. + */ + virtual int getMaxMetadataSize() { return DEFAULT_METADATA_SIZE_IN_BYTE; } + + /** Occurs when the SDK is ready to send metadata. + + @note Ensure that the size of the metadata does not exceed the value set in the `getMaxMetadataSize` callback. + + @param metadata The metadata that the user wants to send. For details, see Metadata. + @param source_type The video data type: #VIDEO_SOURCE_TYPE. + @return + - true: Send. + - false: Do not send. + */ + virtual bool onReadyToSendMetadata(Metadata &metadata, VIDEO_SOURCE_TYPE source_type) = 0; + + /** Occurs when the local user receives the metadata. + * + * @param metadata The metadata received. See Metadata. + * + * @note If the receiver is an audience member, the receiver cannot get the NTP timestamp (ms) + * at which the metadata was sent. + */ + virtual void onMetadataReceived(const Metadata& metadata) = 0; +}; + +// The reason codes for media streaming +// GENERATED_JAVA_ENUM_PACKAGE: io.agora.streaming +enum DIRECT_CDN_STREAMING_REASON { + // No error occurs. + DIRECT_CDN_STREAMING_REASON_OK = 0, + // A general error occurs (no specified reason).
+ DIRECT_CDN_STREAMING_REASON_FAILED = 1, + // Audio publication error. + DIRECT_CDN_STREAMING_REASON_AUDIO_PUBLICATION = 2, + // Video publication error. + DIRECT_CDN_STREAMING_REASON_VIDEO_PUBLICATION = 3, + + DIRECT_CDN_STREAMING_REASON_NET_CONNECT = 4, + // The stream name already exists. + DIRECT_CDN_STREAMING_REASON_BAD_NAME = 5, +}; + +// The connection state of media streaming +// GENERATED_JAVA_ENUM_PACKAGE: io.agora.streaming +enum DIRECT_CDN_STREAMING_STATE { + + DIRECT_CDN_STREAMING_STATE_IDLE = 0, + + DIRECT_CDN_STREAMING_STATE_RUNNING = 1, + + DIRECT_CDN_STREAMING_STATE_STOPPED = 2, + + DIRECT_CDN_STREAMING_STATE_FAILED = 3, + + DIRECT_CDN_STREAMING_STATE_RECOVERING = 4, +}; + +/** + * The statistics of the direct CDN streams. + */ +struct DirectCdnStreamingStats { + /** + * Width of the video pushed by RTMP. + */ + int videoWidth; + + /** + * Height of the video pushed by RTMP. + */ + int videoHeight; + + /** + * The frame rate of the video pushed by RTMP. + */ + int fps; + + /** + * Real-time bit rate of the video streamed by RTMP. + */ + int videoBitrate; + + /** + * Real-time bit rate of the audio pushed by RTMP. + */ + int audioBitrate; +}; + +/** + * The event handler for direct CDN streaming. + * + */ +class IDirectCdnStreamingEventHandler { + public: + virtual ~IDirectCdnStreamingEventHandler() {} + + /** + * Event callback of direct CDN streaming. + * @param state The current state. + * @param reason The reason code. + * @param message The message. + */ + virtual void onDirectCdnStreamingStateChanged(DIRECT_CDN_STREAMING_STATE state, DIRECT_CDN_STREAMING_REASON reason, const char* message) { + (void)state; + (void)reason; + (void)message; + }; + + virtual void onDirectCdnStreamingStats(const DirectCdnStreamingStats& stats) { + (void)stats; + }; +}; + +/** + * The channel media options. + */ +struct DirectCdnStreamingMediaOptions { + /** + * Determines whether to publish the video of the camera track. + * - true: Publish the video track of the camera capturer.
+ * - false: (Default) Do not publish the video track of the camera capturer. + */ + Optional<bool> publishCameraTrack; + /** + * Determines whether to publish the recorded audio. + * - true: Publish the recorded audio. + * - false: (Default) Do not publish the recorded audio. + */ + Optional<bool> publishMicrophoneTrack; + /** + * Determines whether to publish the audio of the custom audio track. + * - true: Publish the audio of the custom audio track. + * - false: (Default) Do not publish the audio of the custom audio track. + */ + Optional<bool> publishCustomAudioTrack; + /** + * Determines whether to publish the video of the custom video track. + * - true: Publish the video of the custom video track. + * - false: (Default) Do not publish the video of the custom video track. + */ + Optional<bool> publishCustomVideoTrack; + /** + * Determines whether to publish the audio track of the media player source. + * - true: Publish the audio track of the media player source. + * - false: (Default) Do not publish the audio track of the media player source. + */ + Optional<bool> publishMediaPlayerAudioTrack; + /** + * Determines which media player source should be published. + * You can get the MediaPlayerId after calling getMediaPlayerId() of AgoraMediaPlayer. + */ + Optional<int> publishMediaPlayerId; + /** + * The custom video track ID which will be used to publish. + * You can get the VideoTrackId after calling createCustomVideoTrack() of IRtcEngine.
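The `Optional` members above carry set/unset state, which is what this struct's `SetAll`/`SetFrom` merge machinery relies on: only explicitly set fields overwrite the destination. A minimal sketch of that semantics with a hand-rolled `Opt` type (illustrative; the SDK's `Optional` template is richer):

```cpp
#include <cassert>

// Minimal stand-in for the SDK's Optional<T>: tracks whether a value was set.
template <typename T>
struct Opt {
  T val{};
  bool has = false;
  void set(T v) { val = v; has = true; }
};

// Mirrors the SetFrom helper used by DirectCdnStreamingMediaOptions::SetAll:
// copy `change` into `dst` only when `change` actually carries a value.
template <typename T>
void SetFrom(Opt<T>* dst, const Opt<T>& change) {
  if (change.has) *dst = change;
}

struct Options {
  Opt<bool> publishCameraTrack;
  Opt<bool> publishMicrophoneTrack;

  // Merge another Options into this one, field by field.
  void SetAll(const Options& change) {
    SetFrom(&publishCameraTrack, change.publishCameraTrack);
    SetFrom(&publishMicrophoneTrack, change.publishMicrophoneTrack);
  }
};
```

This is why updating options with a sparsely filled struct leaves previously configured fields untouched: unset fields never overwrite the destination, even with a "falsy" default value.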
+ */ + Optional<video_track_id_t> customVideoTrackId; + + DirectCdnStreamingMediaOptions() {} + ~DirectCdnStreamingMediaOptions() {} + + void SetAll(const DirectCdnStreamingMediaOptions& change) { +#define SET_FROM(X) SetFrom(&X, change.X) + SET_FROM(publishCameraTrack); + SET_FROM(publishMicrophoneTrack); + SET_FROM(publishCustomAudioTrack); + SET_FROM(publishCustomVideoTrack); + SET_FROM(publishMediaPlayerAudioTrack); + SET_FROM(publishMediaPlayerId); + SET_FROM(customVideoTrackId); +#undef SET_FROM + } + + bool operator==(const DirectCdnStreamingMediaOptions& o) const { +#define BEGIN_COMPARE() bool b = true +#define ADD_COMPARE(X) b = (b && (X == o.X)) +#define END_COMPARE() + + BEGIN_COMPARE(); + ADD_COMPARE(publishCameraTrack); + ADD_COMPARE(publishMicrophoneTrack); + ADD_COMPARE(publishCustomAudioTrack); + ADD_COMPARE(publishCustomVideoTrack); + ADD_COMPARE(publishMediaPlayerAudioTrack); + ADD_COMPARE(customVideoTrackId); + ADD_COMPARE(publishMediaPlayerId); + END_COMPARE(); + +#undef BEGIN_COMPARE +#undef ADD_COMPARE +#undef END_COMPARE + return b; + } + + DirectCdnStreamingMediaOptions& operator=(const DirectCdnStreamingMediaOptions& replace) { + if (this != &replace) { +#define REPLACE_BY(X) ReplaceBy(&X, replace.X) + + REPLACE_BY(publishCameraTrack); + REPLACE_BY(publishMicrophoneTrack); + REPLACE_BY(publishCustomAudioTrack); + REPLACE_BY(publishCustomVideoTrack); + REPLACE_BY(publishMediaPlayerAudioTrack); + REPLACE_BY(customVideoTrackId); + REPLACE_BY(publishMediaPlayerId); +#undef REPLACE_BY + } + return *this; + } +}; + +/** + * The information for extension. + */ +struct ExtensionInfo { + /** + * The type of media device. + */ + agora::media::MEDIA_SOURCE_TYPE mediaSourceType; + + /** + * The id of the remote user on which the extension works. + * + * @note remoteUid = 0 means that the extension works on all remote streams. + */ + uid_t remoteUid; + + /** + * The unique channel name for the AgoraRTC session in the string format.
The string + * length must be less than 64 bytes. Supported character scopes are: + * - All lowercase English letters: a to z. + * - All uppercase English letters: A to Z. + * - All numeric characters: 0 to 9. + * - The space character. + * - Punctuation characters and other symbols, including: "!", "#", "$", "%", "&", "(", ")", "+", + * "-", + * ":", ";", "<", "=", ".", ">", "?", "@", "[", "]", "^", "_", " {", "}", "|", "~", ",". + */ + const char* channelId; + + /** + * User ID: A 32-bit unsigned integer ranging from 1 to (2^32-1). It must be unique. + */ + uid_t localUid; + + ExtensionInfo() : mediaSourceType(agora::media::UNKNOWN_MEDIA_SOURCE), remoteUid(0), channelId(NULL), localUid(0) {} +}; + +class IMediaPlayer; +class IMediaRecorder; + +/** + * The IRtcEngine class, which is the basic interface of the Agora SDK that implements the core functions of real-time communication. + * + * `IRtcEngine` provides the main methods that your app can call. + * + */ +class IRtcEngine : public agora::base::IEngineBase { + public: + /** + * Releases the IRtcEngine object. + * + * This method releases all resources used by the Agora SDK. Use this method for apps in which users + * occasionally make voice or video calls. When users do not make calls, you can free up resources for + * other operations. + * + * After a successful method call, you can no longer use any method or callback in the SDK. + * If you want to use the real-time communication functions again, you must call `createAgoraRtcEngine` + * and `initialize` to create a new `IRtcEngine` instance. + * + * @note If you want to create a new `IRtcEngine` instance after destroying the current one, ensure + * that you wait until the `release` method execution completes. + * + * @param sync Determines whether this method is a synchronous call. + * - `true`: This method is a synchronous call, which means that the result of this method call + * returns after the IRtcEngine object resources are released.
Do not call this method + * in any callback generated by the SDK, or it may result in a deadlock. The SDK automatically + * detects the deadlock and turns this method into an asynchronous call, but the detection itself takes + * extra time. + * - `false`: This method is an asynchronous call. The result returns immediately even when the + * IRtcEngine object resources are not released. + * + */ + virtual void release(bool sync = false) = 0; + + /** + * Initializes `IRtcEngine`. + * + * All methods provided by the `IRtcEngine` class are executed asynchronously. Agora + * recommends calling these methods in the same thread. + * + * @note + * - Before calling other APIs, you must call `createAgoraRtcEngine` and `initialize` to create and + * initialize the `IRtcEngine` object. + * - The SDK supports creating only one `IRtcEngine` instance for an app. + * + * @param context The RtcEngineContext object. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int initialize(const RtcEngineContext& context) = 0; + + /** + * Gets the pointer to the specified interface. + * + * @param iid The ID of the interface. See #INTERFACE_ID_TYPE for details. + * @param inter Output parameter. The pointer to the specified interface. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int queryInterface(INTERFACE_ID_TYPE iid, void** inter) = 0; + + + /** + * Gets the SDK version. + * @param build The build number. + * @return The version of the current SDK in the string format. + */ + virtual const char* getVersion(int* build) = 0; + + /** + * Gets the warning or error description. + * @param code The error code or warning code reported by the SDK. + * @return The specific error or warning description. + */ + virtual const char* getErrorDescription(int code) = 0; + + /** + * Queries the codec capabilities of the current device. + * + * @param codecInfo An array of the codec capability information: CodecCapInfo. + * @param size The array size. + * @return + * - 0: Success.
+ * - < 0: Failure. + */ + virtual int queryCodecCapability(CodecCapInfo* codecInfo, int& size) = 0; + + /** + * Queries the score of the current device. + * + * @return + * - > 0: The device score. Most devices score between 60 and 100, with higher scores indicating better performance. + * - < 0: Failure. + */ + virtual int queryDeviceScore() = 0; + + /** + * Preload a channel. + * + * This method enables users to preload a channel. + * + * A successful call of this method will reduce the time of joining the same channel. + * + * Note: + * 1. The SDK supports preloading up to 20 channels. Once the preloaded channels exceed the limit, the SDK will keep the latest 20 available. + * 2. Renew the token of the preloaded channel by calling this method with the same 'channelId' and 'uid'. + * + * @param token The token generated on your server for authentication. + * @param channelId The channel name. This parameter signifies the channel in which users engage in + * real-time audio and video interaction. Under the premise of the same App ID, users who fill in + * the same channel ID enter the same channel for audio and video interaction. The string length + * must be less than 64 bytes. Supported character scopes are: + * - All lowercase English letters: a to z. + * - All uppercase English letters: A to Z. + * - All numeric characters: 0 to 9. + * - The space character. + * - Punctuation characters and other symbols, including: "!", "#", "$", "%", "&", "(", ")", "+", "-", + * ":", ";", "<", "=", ".", ">", "?", "@", "[", "]", "^", "_", " {", "}", "|", "~", ",". + * @param uid The user ID. This parameter is used to identify the user in the channel for real-time + * audio and video interaction. You need to set and manage user IDs yourself, and ensure that each + * user ID in the same channel is unique. This parameter is a 32-bit unsigned integer. The value + * range is 1 to 2^32-1.
If the user ID is not assigned (or set to 0), the SDK assigns a random user + * ID and returns it in the onJoinChannelSuccess callback. Your application must record and maintain + * the returned user ID, because the SDK does not do so. + * + * @return + * - 0: Success. + * - < 0: Failure. + * - -7: The IRtcEngine object has not been initialized. You need to initialize the IRtcEngine + * object before calling this method. + * - -102: The channel name is invalid. You need to pass in a valid channel name in channelId to + * preload the channel again. + */ + virtual int preloadChannel(const char* token, const char* channelId, uid_t uid) = 0; + + /** + * Preload a channel. + * + * This method enables users to preload a channel. + * + * A successful call of this method will reduce the time of joining the same channel. + * + * Note: + * 1. The SDK supports preloading up to 20 channels. Once the preloaded channels exceed the limit, the SDK will keep the latest 20 available. + * 2. Renew the token of the preloaded channel by calling this method with the same 'channelId' and 'userAccount'. + * + * @param token The token generated on your server for authentication. + * @param channelId The channel name. This parameter signifies the channel in which users engage in + * real-time audio and video interaction. Under the premise of the same App ID, users who fill in + * the same channel ID enter the same channel for audio and video interaction. The string length + * must be less than 64 bytes. Supported character scopes are: + * - All lowercase English letters: a to z. + * - All uppercase English letters: A to Z. + * - All numeric characters: 0 to 9. + * - The space character. + * - Punctuation characters and other symbols, including: "!", "#", "$", "%", "&", "(", ")", "+", "-", + * ":", ";", "<", "=", ".", ">", "?", "@", "[", "]", "^", "_", " {", "}", "|", "~", ",". + * @param userAccount The user account. The maximum length of this parameter is 255 bytes. 
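The preload notes above ("keep the latest 20", and re-preloading the same channel renews its token) behave like a small recency cache keyed by channel and user. A self-contained sketch of that policy (the `PreloadCache` type is illustrative, not an SDK class):

```cpp
#include <cassert>
#include <deque>
#include <string>
#include <utility>

// Keeps at most kMax preloaded channels; the most recently preloaded are
// retained. Re-preloading an existing (channelId, uid) pair refreshes its
// position, which models how a preloaded channel's token gets renewed.
class PreloadCache {
 public:
  static constexpr size_t kMax = 20;

  void preload(const std::string& channelId, unsigned int uid) {
    std::pair<std::string, unsigned int> key(channelId, uid);
    for (auto it = entries_.begin(); it != entries_.end(); ++it) {
      if (*it == key) { entries_.erase(it); break; }  // refresh existing entry
    }
    entries_.push_back(key);
    if (entries_.size() > kMax) entries_.pop_front();  // evict the oldest
  }

  size_t size() const { return entries_.size(); }

  bool contains(const std::string& channelId, unsigned int uid) const {
    for (const auto& e : entries_)
      if (e.first == channelId && e.second == uid) return true;
    return false;
  }

 private:
  std::deque<std::pair<std::string, unsigned int>> entries_;  // oldest first
};
```

The SDK enforces this limit internally; the sketch only illustrates which channels survive once more than 20 have been preloaded.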
Ensure that you set this parameter and do not set it as null. Supported character scopes are: + * - All lowercase English letters: a to z. + * - All uppercase English letters: A to Z. + * - All numeric characters: 0 to 9. + * - The space character. + * - Punctuation characters and other symbols, including: "!", "#", "$", "%", "&", "(", ")", "+", "-", ":", ";", "<", "=", ".", ">", "?", "@", "[", "]", "^", "_", " {", "}", "|", "~", ",". + * + * @return + * - 0: Success. + * - < 0: Failure. + * - -2: The parameter is invalid. For example, the userAccount parameter is empty. + * You need to pass in a valid parameter and preload the channel again. + * - -7: The IRtcEngine object has not been initialized. You need to initialize the IRtcEngine + * object before calling this method. + * - -102: The channel name is invalid. You need to pass in a valid channel name in channelId to + * preload the channel again. + */ + virtual int preloadChannelWithUserAccount(const char* token, const char* channelId, const char* userAccount) = 0; + + /** + * Updates the token of the preloaded channels. + * + * An easy way to update all preloaded channels' tokens, if all preloaded channels use the same token. + * + * If preloaded channels use different tokens, you need to call the 'preloadChannel' method with the same 'channelId' + * and 'uid' or 'userAccount' to update the corresponding token. + * + * @param token The token generated on your server for authentication. + * + * @return + * - 0: Success. + * - < 0: Failure. + * - -2: The token is invalid. You need to pass in a valid token and update the token again. + * - -7: The IRtcEngine object has not been initialized. You need to initialize the IRtcEngine + * object before calling this method. + */ + virtual int updatePreloadChannelToken(const char* token) = 0; + + /** + * Joins a channel. + * + * This method enables users to join a channel.
Users in the same channel can talk to each other, + * and multiple users in the same channel can start a group chat. Users with different App IDs + * cannot call each other. + * + * A successful call of this method triggers the following callbacks: + * - The local client: The `onJoinChannelSuccess` and `onConnectionStateChanged` callbacks. + * - The remote client: `onUserJoined`, if the user joining the channel is in the Communication + * profile or is a host in the Live-broadcasting profile. + * + * When the connection between the client and Agora's server is interrupted due to poor network + * conditions, the SDK tries reconnecting to the server. When the local client successfully rejoins + * the channel, the SDK triggers the `onRejoinChannelSuccess` callback on the local client. + * + * @note Once a user joins the channel, the user subscribes to the audio and video streams of all + * the other users in the channel by default, giving rise to usage and billing calculation. To + * stop subscribing to a specified stream or all remote streams, call the corresponding `mute` methods. + * + * @param token The token generated on your server for authentication. + * @param channelId The channel name. This parameter signifies the channel in which users engage in + * real-time audio and video interaction. Under the premise of the same App ID, users who fill in + * the same channel ID enter the same channel for audio and video interaction. The string length + * must be less than 64 bytes. Supported character scopes are: + * - All lowercase English letters: a to z. + * - All uppercase English letters: A to Z. + * - All numeric characters: 0 to 9. + * - The space character. + * - Punctuation characters and other symbols, including: "!", "#", "$", "%", "&", "(", ")", "+", "-", + * ":", ";", "<", "=", ".", ">", "?", "@", "[", "]", "^", "_", " {", "}", "|", "~", ",". + * @param info (Optional) Reserved for future use. + * @param uid The user ID. 
This parameter is used to identify the user in the channel for real-time + * audio and video interaction. You need to set and manage user IDs yourself, and ensure that each + * user ID in the same channel is unique. This parameter is a 32-bit unsigned integer. The value + * range is 1 to 2^32-1. If the user ID is not assigned (or set to 0), the SDK assigns a random user + * ID and returns it in the onJoinChannelSuccess callback. Your application must record and maintain + * the returned user ID, because the SDK does not do so. + * + * @return + * - 0: Success. + * - < 0: Failure. + * - -2: The parameter is invalid. For example, the token is invalid, the uid parameter is not set + * to an integer, or the value of a member in the `ChannelMediaOptions` structure is invalid. You need + * to pass in a valid parameter and join the channel again. + * - -3: Fails to initialize the `IRtcEngine` object. You need to reinitialize the IRtcEngine object. + * - -7: The IRtcEngine object has not been initialized. You need to initialize the IRtcEngine + * object before calling this method. + * - -8: The internal state of the IRtcEngine object is wrong. The typical cause is that you call + * this method to join the channel without calling `stopEchoTest` to stop the test after calling + * `startEchoTest` to start a call loop test. You need to call `stopEchoTest` before calling this method. + * - -17: The request to join the channel is rejected. The typical cause is that the user is in the + * channel. Agora recommends using the `onConnectionStateChanged` callback to get whether the user is + * in the channel. Do not call this method to join the channel unless you receive the + * `CONNECTION_STATE_DISCONNECTED(1)` state. + * - -102: The channel name is invalid. You need to pass in a valid channel name in channelId to + * rejoin the channel. + * - -121: The user ID is invalid. You need to pass in a valid user ID in uid to rejoin the channel.
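Callers usually turn the return codes listed above into a concrete next action. A sketch of such a dispatch (the `joinChannelAction` helper and its action strings are illustrative, not part of the SDK):

```cpp
#include <cassert>
#include <string>

// Maps joinChannel return codes (documented above) to a caller-side action.
// 0 means the request was accepted; negative values are failures.
std::string joinChannelAction(int code) {
  switch (code) {
    case 0:    return "ok";                    // request accepted
    case -2:   return "fix-parameters";        // invalid token/uid/options
    case -3:   return "reinitialize-engine";   // IRtcEngine failed to initialize
    case -7:   return "initialize-first";      // engine not initialized yet
    case -8:   return "stop-echo-test-first";  // stopEchoTest was not called
    case -17:  return "already-in-channel";    // wait for DISCONNECTED state
    case -102: return "fix-channel-name";      // invalid channelId
    case -121: return "fix-user-id";           // invalid uid
    default:   return "generic-failure";       // other negative codes
  }
}
```

Note that a `0` return only means the asynchronous join request was accepted; actual success is reported later through `onJoinChannelSuccess`.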
+ */ + virtual int joinChannel(const char* token, const char* channelId, const char* info, uid_t uid) = 0; + + /** + * Joins a channel with media options. + * + * This method enables users to join a channel. Users in the same channel can talk to each other, + * and multiple users in the same channel can start a group chat. Users with different App IDs + * cannot call each other. + * + * A successful call of this method triggers the following callbacks: + * - The local client: The `onJoinChannelSuccess` and `onConnectionStateChanged` callbacks. + * - The remote client: `onUserJoined`, if the user joining the channel is in the Communication + * profile or is a host in the Live-broadcasting profile. + * + * When the connection between the client and Agora's server is interrupted due to poor network + * conditions, the SDK tries reconnecting to the server. When the local client successfully rejoins + * the channel, the SDK triggers the `onRejoinChannelSuccess` callback on the local client. + * + * Compared to `joinChannel`, this method adds the options parameter to configure whether to + * automatically subscribe to all remote audio and video streams in the channel when the user + * joins the channel. By default, the user subscribes to the audio and video streams of all + * the other users in the channel, giving rise to usage and billing. To unsubscribe, set the + * `options` parameter or call the `mute` methods accordingly. + * + * @note + * - This method allows users to join only one channel at a time. + * - Ensure that the app ID you use to generate the token is the same app ID that you pass in the + * `initialize` method; otherwise, you may fail to join the channel by token. + * + * @param token The token generated on your server for authentication. + * + * @param channelId The channel name. This parameter signifies the channel in which users engage in + * real-time audio and video interaction.
Under the premise of the same App ID, users who fill in + * the same channel ID enter the same channel for audio and video interaction. The string length + * must be less than 64 bytes. Supported character scopes are: + * - All lowercase English letters: a to z. + * - All uppercase English letters: A to Z. + * - All numeric characters: 0 to 9. + * - The space character. + * - Punctuation characters and other symbols, including: "!", "#", "$", "%", "&", "(", ")", "+", "-", + * ":", ";", "<", "=", ".", ">", "?", "@", "[", "]", "^", "_", " {", "}", "|", "~", ",". + * @param uid The user ID. This parameter is used to identify the user in the channel for real-time + * audio and video interaction. You need to set and manage user IDs yourself, and ensure that each + * user ID in the same channel is unique. This parameter is a 32-bit unsigned integer. The value + * range is 1 to 2^32-1. If the user ID is not assigned (or set to 0), the SDK assigns a random user + * ID and returns it in the `onJoinChannelSuccess` callback. Your application must record and maintain + * the returned user ID, because the SDK does not do so. + * @param options The channel media options: ChannelMediaOptions. + * + * @return + * - 0: Success. + * - < 0: Failure. + * - -2: The parameter is invalid. For example, the token is invalid, the uid parameter is not set + * to an integer, or the value of a member in the `ChannelMediaOptions` structure is invalid. You need + * to pass in a valid parameter and join the channel again. + * - -3: Fails to initialize the `IRtcEngine` object. You need to reinitialize the IRtcEngine object. + * - -7: The IRtcEngine object has not been initialized. You need to initialize the IRtcEngine + * object before calling this method. + * - -8: The internal state of the IRtcEngine object is wrong. The typical cause is that you call + * this method to join the channel without calling `stopEchoTest` to stop the test after calling + * `startEchoTest` to start a call loop test.
You need to call `stopEchoTest` before calling this method.
+ * - -17: The request to join the channel is rejected. The typical cause is that the user is already in the
+ * channel. Agora recommends using the `onConnectionStateChanged` callback to determine whether the user is
+ * in the channel. Do not call this method to join the channel unless you receive the
+ * `CONNECTION_STATE_DISCONNECTED(1)` state.
+ * - -102: The channel name is invalid. You need to pass in a valid channel name in channelId to
+ * rejoin the channel.
+ * - -121: The user ID is invalid. You need to pass in a valid user ID in uid to rejoin the channel.
+ */
+ virtual int joinChannel(const char* token, const char* channelId, uid_t uid, const ChannelMediaOptions& options) = 0;
+
+ /**
+ * Updates the channel media options after joining the channel.
+ *
+ * @param options The channel media options: ChannelMediaOptions.
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int updateChannelMediaOptions(const ChannelMediaOptions& options) = 0;
+
+ /**
+ * Leaves the channel.
+ *
+ * This method allows a user to leave the channel, for example, by hanging up or exiting a call.
+ *
+ * This method is an asynchronous call, which means that it returns even before
+ * the user has actually left the channel. Once the user successfully leaves the channel, the
+ * SDK triggers the \ref IRtcEngineEventHandler::onLeaveChannel "onLeaveChannel" callback.
+ *
+ * @note
+ * If you call \ref release "release" immediately after this method, the leaveChannel process will be
+ * interrupted, and the SDK will not trigger the `onLeaveChannel` callback.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int leaveChannel() = 0;
+
+ /**
+ * Leaves the channel.
+ *
+ * @param options The leave channel options.
+ *
+ * This method allows a user to leave the channel, for example, by hanging up or exiting a call.
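The channel-name and user-ID rules above are easy to get wrong on the app side and surface only as a `-102`/`-121` error at join time. Below is a small self-contained sketch, not SDK code: `isValidChannelId` is a hypothetical helper that pre-validates a channel name against the documented constraints (fewer than 64 bytes, drawn from the listed character set) before it is handed to `joinChannel`.

```cpp
#include <cassert>
#include <string>

// Hypothetical helper (not part of the SDK): checks a channel name against
// the rules documented for joinChannel -- fewer than 64 bytes, drawn from
// lowercase/uppercase letters, digits, the space character, and the listed
// punctuation. An empty name is rejected here as a defensive choice.
bool isValidChannelId(const std::string& channelId) {
    static const std::string punctuation = "!#$%&()+-:;<=.>?@[]^_{}|~,";
    if (channelId.empty() || channelId.size() >= 64) return false;
    for (char c : channelId) {
        bool ok = (c >= 'a' && c <= 'z') || (c >= 'A' && c <= 'Z') ||
                  (c >= '0' && c <= '9') || c == ' ' ||
                  punctuation.find(c) != std::string::npos;
        if (!ok) return false;
    }
    return true;
}
```

Checking locally lets the app report a bad name before opening a connection, instead of waiting for the join request to be rejected.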
+ *
+ * This method is an asynchronous call, which means that it returns even before
+ * the user has actually left the channel. Once the user successfully leaves the channel, the
+ * SDK triggers the \ref IRtcEngineEventHandler::onLeaveChannel "onLeaveChannel" callback.
+ *
+ * @note
+ * If you call \ref release "release" immediately after this method, the leaveChannel process will be
+ * interrupted, and the SDK will not trigger the `onLeaveChannel` callback.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int leaveChannel(const LeaveChannelOptions& options) = 0;
+
+ /**
+ * Renews the token.
+ *
+ * Once a token is enabled and used, it expires after a certain period of time.
+ *
+ * Under the following circumstances, generate a new token on your server, and then call this method to
+ * renew it. Failure to do so results in the SDK disconnecting from the server.
+ * - The \ref IRtcEngineEventHandler::onTokenPrivilegeWillExpire "onTokenPrivilegeWillExpire" callback is triggered;
+ * - The \ref IRtcEngineEventHandler::onRequestToken "onRequestToken" callback is triggered;
+ * - The `ERR_TOKEN_EXPIRED(-109)` error is reported.
+ *
+ * @param token The new token.
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int renewToken(const char* token) = 0;
+
+ /**
+ * Sets the channel profile.
+ *
+ * The IRtcEngine differentiates channel profiles and applies different optimization algorithms accordingly.
+ * For example, it prioritizes smoothness and low latency for a video call, and prioritizes video quality
+ * for a video broadcast.
+ *
+ * @note
+ * - To ensure the quality of real-time communication, we recommend that all users in a channel use the
+ * same channel profile.
+ * - Call this method before calling `joinChannel`. You cannot set the channel profile
+ * once you have joined the channel.
+ *
+ * @param profile The channel profile: #CHANNEL_PROFILE_TYPE.
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setChannelProfile(CHANNEL_PROFILE_TYPE profile) = 0;
+
+ /**
+ * Sets the role of a user.
+ *
+ * This method sets the user role as either BROADCASTER or AUDIENCE (default).
+ * - The broadcaster sends and receives streams.
+ * - The audience receives streams only.
+ *
+ * By default, all users are audience regardless of the channel profile.
+ * Call this method to change the user role to BROADCASTER so that the user can
+ * send a stream.
+ *
+ * @note
+ * After you call setClientRole() with CLIENT_ROLE_AUDIENCE, the SDK stops audio recording.
+ * However, audio recording continues for CLIENT_ROLE_AUDIENCE when the audio scenario is
+ * AUDIO_SCENARIO_CHATROOM(5). Normally, app developers can use the mute APIs to achieve the
+ * same result; this 'non-orthogonal' behavior is implemented only to keep the API backward compatible.
+ *
+ * @param role The role of the client: #CLIENT_ROLE_TYPE.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setClientRole(CLIENT_ROLE_TYPE role) = 0;
+
+ /** Sets the role of the user, such as a host or an audience (default), before joining a channel in the live interactive streaming.
+ *
+ * This method can be used to switch the user role in the live interactive streaming after the user joins a channel.
+ *
+ * In the `LIVE_BROADCASTING` profile, when a user switches user roles after joining a channel, a successful \ref agora::rtc::IRtcEngine::setClientRole "setClientRole" method call triggers the following callbacks:
+ * - The local client: \ref agora::rtc::IRtcEngineEventHandler::onClientRoleChanged "onClientRoleChanged"
+ * - The remote client: \ref agora::rtc::IRtcEngineEventHandler::onUserJoined "onUserJoined" or \ref agora::rtc::IRtcEngineEventHandler::onUserOffline "onUserOffline" (BECOME_AUDIENCE)
+ *
+ * @note
+ * This method applies only to the `LIVE_BROADCASTING` profile.
+ *
+ * @param role Sets the role of the user. See #CLIENT_ROLE_TYPE.
+ * @param options Sets the audience latency level of the user. See #ClientRoleOptions.
+ *
+ * @return
+ * - 0(ERR_OK): Success.
+ * - < 0: Failure.
+ * - -1(ERR_FAILED): A general error occurs (no specified reason).
+ * - -2(ERR_INVALID_ARGUMENT): The parameter is invalid.
+ * - -7(ERR_NOT_INITIALIZED): The SDK is not initialized.
+ */
+ virtual int setClientRole(CLIENT_ROLE_TYPE role, const ClientRoleOptions& options) = 0;
+
+ /** Starts an audio call test.
+
+ This method launches an audio call test to determine whether the audio devices
+ (for example, headset and speaker) and the network connection are working
+ properly.
+
+ In the test, the user first speaks, and the recording is played back
+ within 10 seconds. If the user can hear the playback within 10 seconds, it indicates
+ that the audio devices and network connection work properly.
+
+ @note
+ After calling the startEchoTest() method, always call stopEchoTest() to end
+ the test. Otherwise, the app cannot run the next echo test, nor can
+ it call the joinChannel() method.
+
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int startEchoTest() = 0;
+
+ /** Starts an audio call test.
+
+ This method starts an audio call test to determine whether the audio devices (for example, headset and speaker) and the network connection are working properly.
+
+ In the audio call test, you record your voice. If the recording plays back within the set time interval, the audio devices and the network connection are working properly.
+
+ @note
+ - Call this method before joining a channel.
+ - After calling this method, call the \ref IRtcEngine::stopEchoTest "stopEchoTest" method to end the test. Otherwise, the app cannot run the next echo test, nor can it call the \ref IRtcEngine::joinChannel "joinChannel" method.
+ - In the `LIVE_BROADCASTING` profile, only a host can call this method.
+ @param intervalInSeconds The time interval (s) between when you speak and when the recording plays back.
+
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int startEchoTest(int intervalInSeconds) = 0;
+
+ /** Starts a video call test.
+ *
+ * @param config The configuration of the video call test. See EchoTestConfiguration.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int startEchoTest(const EchoTestConfiguration& config) = 0;
+
+ /** Stops the audio call test.
+ @return int
+
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int stopEchoTest() = 0;
+
+#if defined(__APPLE__) && TARGET_OS_IOS
+ /** Sets whether the SDK uses AVCaptureMultiCamSession or AVCaptureSession. Applies to iOS 13.0+ only.
+ * @param enabled Whether to enable multi-camera when capturing video:
+ * - true: Enable multi-camera, and the SDK uses AVCaptureMultiCamSession.
+ * - false: Disable multi-camera, and the SDK uses AVCaptureSession.
+ * @param config The config for the secondary camera capture session. See #CameraCapturerConfiguration.
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int enableMultiCamera(bool enabled, const CameraCapturerConfiguration& config) = 0;
+#endif
+ /**
+ * Enables the video.
+ *
+ * You can call this method either before joining a channel or during a call.
+ * If you call this method before entering a channel, the service starts the video; if you call it
+ * during a call, the audio call switches to a video call.
+ *
+ * @note
+ * This method controls the underlying states of the Engine. It is still
+ * valid after one leaves the channel.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int enableVideo() = 0;
+
+ /**
+ * Disables the video.
+ *
+ * This method stops capturing the local video and receiving any remote video.
+ * To enable the local preview function, call \ref enableLocalVideo "enableLocalVideo" (true).
+ * @return int
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int disableVideo() = 0;
+
+ /**
+ * Starts the local video preview before joining a channel.
+ * + * Once you call this method to start the local video preview, if you leave + * the channel by calling \ref leaveChannel "leaveChannel", the local video preview remains until + * you call \ref stopPreview "stopPreview" to disable it. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int startPreview() = 0; + + /** + * Starts the local video preview for specific source type. + * @param sourceType - The video source type. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int startPreview(VIDEO_SOURCE_TYPE sourceType) = 0; + + /** + * Stops the local video preview and the video. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int stopPreview() = 0; + + /** + * Stops the local video preview for specific source type. + * @param sourceType - The video source type. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int stopPreview(VIDEO_SOURCE_TYPE sourceType) = 0; + + /** Starts the last-mile network probe test. + + This method starts the last-mile network probe test before joining a channel + to get the uplink and downlink last-mile network statistics, including the + bandwidth, packet loss, jitter, and round-trip time (RTT). + + Call this method to check the uplink network quality before users join a + channel or before an audience switches to a host. Once this method is + enabled, the SDK returns the following callbacks: + - \ref IRtcEngineEventHandler::onLastmileQuality "onLastmileQuality": the + SDK triggers this callback depending on the network + conditions. This callback rates the network conditions and is more closely + linked to the user experience. + - \ref IRtcEngineEventHandler::onLastmileProbeResult "onLastmileProbeResult": + the SDK triggers this callback within 30 seconds depending on the network + conditions. This callback returns the real-time statistics of the network + conditions and is more objective. 
+ + @note + - Do not call other methods before receiving the + \ref IRtcEngineEventHandler::onLastmileQuality "onLastmileQuality" and + \ref IRtcEngineEventHandler::onLastmileProbeResult "onLastmileProbeResult" + callbacks. Otherwise, the callbacks may be interrupted. + - In the Live-broadcast profile, a host should not call this method after + joining a channel. + + @param config Sets the configurations of the last-mile network probe test. See + LastmileProbeConfig. + + @return + - 0: Success. + - < 0: Failure. + */ + virtual int startLastmileProbeTest(const LastmileProbeConfig& config) = 0; + + /** Stops the last-mile network probe test. */ + virtual int stopLastmileProbeTest() = 0; + + /** + * Sets the video encoder configuration. + * + * Each configuration profile corresponds to a set of video parameters, including + * the resolution, frame rate, and bitrate. + * + * The parameters specified in this method are the maximum values under ideal network conditions. + * If the video engine cannot render the video using the specified parameters due + * to poor network conditions, the parameters further down the list are considered + * until a successful configuration is found. + * + * @param config The local video encoder configuration: VideoEncoderConfiguration. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setVideoEncoderConfiguration(const VideoEncoderConfiguration& config) = 0; + + /** Enables/Disables image enhancement and sets the options. + * + * @note Call this method after calling the \ref IRtcEngine::enableVideo "enableVideo" method. + * + * @param enabled Sets whether or not to enable image enhancement: + * - true: enables image enhancement. + * - false: disables image enhancement. + * @param options Sets the image enhancement option. See BeautyOptions. 
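The encoder-configuration comment above says the configured values are maxima, and that under poor network conditions "parameters further down the list are considered until a successful configuration is found". The SDK performs this degradation internally; the sketch below only illustrates the "try an ordered candidate list" idea with a hypothetical `VideoProfile` struct and made-up bitrates.

```cpp
#include <cassert>
#include <vector>

// Illustrative candidate, best-first ordering. Not SDK types: the real
// configuration lives in VideoEncoderConfiguration.
struct VideoProfile { int width, height, fps, bitrateKbps; };

// Pick the first (best) candidate whose bitrate the network can sustain,
// falling back to the lowest profile when none fit.
VideoProfile pickProfile(const std::vector<VideoProfile>& ordered,
                         int availableKbps) {
    for (const VideoProfile& p : ordered)
        if (p.bitrateKbps <= availableKbps) return p;
    return ordered.back();  // worst case: lowest profile
}
```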
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setBeautyEffectOptions(bool enabled, const BeautyOptions& options, agora::media::MEDIA_SOURCE_TYPE type = agora::media::PRIMARY_CAMERA_SOURCE) = 0;
+ /**
+ * Sets low-light enhancement.
+ *
+ * @since v4.0.0
+ *
+ * The low-light enhancement feature can adaptively adjust the brightness value of the video captured in situations with low or uneven lighting, such as backlit, cloudy, or dark scenes. It restores or highlights the image details and improves the overall visual effect of the video.
+ *
+ * You can call this method to enable the low-light enhancement feature and set the options of the low-light enhancement effect.
+ *
+ * @note
+ * - Before calling this method, ensure that you have integrated the following dynamic library into your project:
+ * - Android: `libagora_segmentation_extension.so`
+ * - iOS/macOS: `AgoraVideoSegmentationExtension.xcframework`
+ * - Windows: `libagora_segmentation_extension.dll`
+ * - Call this method after \ref IRtcEngine::enableVideo "enableVideo".
+ * - The low-light enhancement feature has certain performance requirements on devices. If your device overheats after you enable low-light enhancement, Agora recommends modifying the low-light enhancement options to a less performance-consuming level or disabling low-light enhancement entirely.
+ *
+ * @param enabled Sets whether to enable low-light enhancement:
+ * - `true`: Enable.
+ * - `false`: (Default) Disable.
+ * @param options The low-light enhancement options. See LowlightEnhanceOptions.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setLowlightEnhanceOptions(bool enabled, const LowlightEnhanceOptions& options, agora::media::MEDIA_SOURCE_TYPE type = agora::media::PRIMARY_CAMERA_SOURCE) = 0;
+ /**
+ * Sets video noise reduction.
+ *
+ * @since v4.0.0
+ *
+ * Underlit environments and low-end video capture devices can cause video images to contain significant noise, which affects video quality.
In real-time interactive scenarios, video noise also consumes bitstream resources and reduces encoding efficiency during encoding. + * + * You can call this method to enable the video noise reduction feature and set the options of the video noise reduction effect. + * + * @note + * - Before calling this method, ensure that you have integrated the following dynamic library into your project: + * - Android: `libagora_segmentation_extension.so` + * - iOS/macOS: `AgoraVideoSegmentationExtension.xcframework` + * - Windows: `libagora_segmentation_extension.dll` + * - Call this method after \ref IRtcEngine::enableVideo "enableVideo". + * - The video noise reduction feature has certain performance requirements on devices. If your device overheats after you enable video noise reduction, Agora recommends modifying the video noise reduction options to a less performance-consuming level or disabling video noise reduction entirely. + * + * @param enabled Sets whether to enable video noise reduction: + * - `true`: Enable. + * - `false`: (Default) Disable. + * @param options The video noise reduction options. See VideoDenoiserOptions. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setVideoDenoiserOptions(bool enabled, const VideoDenoiserOptions& options, agora::media::MEDIA_SOURCE_TYPE type = agora::media::PRIMARY_CAMERA_SOURCE) = 0; + /** + * Sets color enhancement. + * + * @since v4.0.0 + * + * The video images captured by the camera can have color distortion. The color enhancement feature intelligently adjusts video characteristics such as saturation and contrast to enhance the video color richness and color reproduction, making the video more vivid. + * + * You can call this method to enable the color enhancement feature and set the options of the color enhancement effect. 
+ * + * @note + * - Before calling this method, ensure that you have integrated the following dynamic library into your project: + * - Android: `libagora_segmentation_extension.so` + * - iOS/macOS: `AgoraVideoSegmentationExtension.xcframework` + * - Windows: `libagora_segmentation_extension.dll` + * - Call this method after \ref IRtcEngine::enableVideo "enableVideo". + * - The color enhancement feature has certain performance requirements on devices. If your device overheats after you enable color enhancement, Agora recommends modifying the color enhancement options to a less performance-consuming level or disabling color enhancement entirely. + * + * @param enabled Sets whether to enable color enhancement: + * - `true`: Enable. + * - `false`: (Default) Disable. + * @param options The color enhancement options. See ColorEnhanceOptions. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setColorEnhanceOptions(bool enabled, const ColorEnhanceOptions& options, agora::media::MEDIA_SOURCE_TYPE type = agora::media::PRIMARY_CAMERA_SOURCE) = 0; + + /** + * Enables/Disables the virtual background. (beta function) + * + * @since v3.7.200 + * + * After enabling the virtual background function, you can replace the original background image of the local user + * with a custom background image. After the replacement, all users in the channel can see the custom background + * image. + * + * @note + * - Before calling this method, ensure that you have integrated the + * `libagora_segmentation_extension.dll` (Windows)/`AgoraVideoSegmentationExtension.framework` (macOS) dynamic + * library into the project folder. + * - Call this method after \ref IRtcEngine::enableVideo "enableVideo". + * - This function requires a high-performance device. Agora recommends that you use this function on devices with + * an i5 CPU and better. 
+ * - Agora recommends that you use this function in scenarios that meet the following conditions:
+ * - A high-definition camera device is used, and the environment is uniformly lit.
+ * - The captured video image is uncluttered, the user's portrait is half-length and largely unobstructed, and the
+ * background is a single color that differs from the color of the user's clothing.
+ *
+ * @param enabled Sets whether to enable the virtual background:
+ * - true: Enable.
+ * - false: Disable.
+ * @param backgroundSource The custom background image. See VirtualBackgroundSource. **Note**: To adapt the
+ * resolution of the custom background image to the resolution of the SDK capturing video, the SDK scales and crops
+ * the custom background image while ensuring that the content of the custom background image is not distorted.
+ * @param segproperty The segmentation property. See SegmentationProperty.
+ * @param type The media source type to which the virtual background applies. See \ref agora::media::MEDIA_SOURCE_TYPE.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int enableVirtualBackground(bool enabled, VirtualBackgroundSource backgroundSource, SegmentationProperty segproperty, agora::media::MEDIA_SOURCE_TYPE type = agora::media::PRIMARY_CAMERA_SOURCE) = 0;
+
+ /**
+ * Initializes the video view of a remote user.
+ *
+ * This method initializes the video view of a remote stream on the local device. It affects only the
+ * video view that the local user sees.
+ *
+ * Usually the app should specify the `uid` of the remote video in the method call before the
+ * remote user joins the channel. If the remote `uid` is unknown to the app, set it later when the
+ * app receives the \ref IRtcEngineEventHandler::onUserJoined "onUserJoined" callback.
+ *
+ * To unbind the remote user from the view, set `view` in VideoCanvas as `null`.
+ *
+ * @note
+ * Ensure that you call this method in the UI thread.
+ *
+ * @param canvas The remote video view settings: VideoCanvas.
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setupRemoteVideo(const VideoCanvas& canvas) = 0;
+
+ /**
+ * Initializes the local video view.
+ *
+ * This method initializes the video view of the local stream on the local device. It affects only
+ * the video view that the local user sees, not the published local video stream.
+ *
+ * To unbind the local video from the view, set `view` in VideoCanvas as `null`.
+ *
+ * @note
+ * Call this method before joining a channel.
+ *
+ * @param canvas The local video view setting: VideoCanvas.
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setupLocalVideo(const VideoCanvas& canvas) = 0;
+
+ /**
+ * Sets the video application scenario.
+ *
+ * @since v4.2.0
+ *
+ * You can call this method to set the expected video scenario.
+ * The SDK will optimize the video experience for each scenario you set.
+ *
+ * @param scenarioType The video application scenario. See #ApplicationScenarioType.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ * - ERR_FAILED (1): A general error occurs (no specified reason).
+ * - ERR_NOT_SUPPORTED (4): Unable to set the video application scenario.
+ * - ERR_NOT_INITIALIZED (7): The SDK is not initialized.
+ */
+ virtual int setVideoScenario(VIDEO_APPLICATION_SCENARIO_TYPE scenarioType) = 0;
+
+ /**
+ * Sets the video QoE preference.
+ *
+ * @since v4.2.1
+ *
+ * You can call this method to set the expected QoE preference.
+ * The SDK will optimize the video experience for each preference you set.
+ *
+ * @param qoePreference The QoE preference type. See #VIDEO_QOE_PREFERENCE_TYPE.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ * - ERR_FAILED (1): A general error occurs (no specified reason).
+ * - ERR_NOT_SUPPORTED (4): Unable to set video application scenario. + * - ERR_NOT_INITIALIZED (7): The SDK is not initialized. + */ + virtual int setVideoQoEPreference(VIDEO_QOE_PREFERENCE_TYPE qoePreference) = 0; + + /** + * Enables the audio. + * + * The audio is enabled by default. + * + * @note + * This method controls the underlying states of the Engine. It is still + * valid after one leaves channel. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableAudio() = 0; + + /** + * Disables the audio. + * + * @note + * This method controls the underlying states of the Engine. It is still + * valid after one leaves channel. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int disableAudio() = 0; + + /** + * Sets the audio parameters and application scenarios. + * + * @deprecated This method is deprecated. You can use the + * \ref IRtcEngine::setAudioProfile(AUDIO_PROFILE_TYPE) "setAudioProfile" + * method instead. To set the audio scenario, call the \ref IRtcEngine::initialize "initialize" + * method and pass value in the `audioScenario` member in the RtcEngineContext struct. + * + * @note + * - Call this method before calling the `joinChannel` method. + * - In scenarios requiring high-quality audio, we recommend setting `profile` as `MUSIC_HIGH_QUALITY`(4) + * and `scenario` as `AUDIO_SCENARIO_GAME_STREAMING`(3). + * + * @param profile Sets the sample rate, bitrate, encoding mode, and the number of channels: + * #AUDIO_PROFILE_TYPE. + * @param scenario Sets the audio application scenarios: #AUDIO_SCENARIO_TYPE. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setAudioProfile(AUDIO_PROFILE_TYPE profile, AUDIO_SCENARIO_TYPE scenario) __deprecated = 0; + + /** + * Sets the audio profile. + * + * @note + * - Call this method before calling the `joinChannel` method. + * - In scenarios requiring high-quality audio, Agora recommends setting `profile` as `MUSIC_HIGH_QUALITY`(4). 
+ * - To set the audio scenario, call the \ref IRtcEngine::initialize "initialize"
+ * method and pass a value to the `audioScenario` member in the RtcEngineContext struct.
+ *
+ * @param profile The audio profile, such as the sample rate, bitrate and codec type: #AUDIO_PROFILE_TYPE.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setAudioProfile(AUDIO_PROFILE_TYPE profile) = 0;
+ /**
+ * Sets the audio scenario.
+ *
+ * @param scenario The audio scenario: #AUDIO_SCENARIO_TYPE.
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setAudioScenario(AUDIO_SCENARIO_TYPE scenario) = 0;
+ /**
+ * Enables or disables the local audio capture.
+ *
+ * The audio function is enabled by default. This method disables or re-enables the
+ * local audio function, that is, to stop or restart local audio capture and
+ * processing.
+ *
+ * This method does not affect receiving or playing the remote audio streams,
+ * and `enableLocalAudio` (false) is applicable to scenarios where the user wants
+ * to receive remote audio streams without sending any audio stream to other users
+ * in the channel.
+ *
+ * @param enabled Determines whether to disable or re-enable the local audio function:
+ * - true: (Default) Re-enable the local audio function, that is, to start local
+ * audio capture and processing.
+ * - false: Disable the local audio function, that is, to stop local audio
+ * capture and processing.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int enableLocalAudio(bool enabled) = 0;
+
+ /**
+ Stops or resumes sending the local audio stream.
+
+ After calling this method successfully, the SDK triggers the
+ \ref IRtcEngineEventHandler::onRemoteAudioStateChanged "onRemoteAudioStateChanged"
+ callback on the remote client with the following parameters:
+ - REMOTE_AUDIO_STATE_STOPPED(0) and REMOTE_AUDIO_REASON_REMOTE_MUTED(5).
+ - REMOTE_AUDIO_STATE_DECODING(2) and REMOTE_AUDIO_REASON_REMOTE_UNMUTED(6).
+
+ @note
+ - When `mute` is set as `true`, this method does not disable the
+ microphone, which does not affect any ongoing recording.
+ - If you call \ref IRtcEngine::setChannelProfile "setChannelProfile" after
+ this method, the SDK resets whether or not to mute the local audio
+ according to the channel profile and user role. Therefore, we recommend
+ calling this method after the `setChannelProfile` method.
+
+ @param mute Determines whether to send or stop sending the local audio stream:
+ - true: Stop sending the local audio stream.
+ - false: (Default) Send the local audio stream.
+
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int muteLocalAudioStream(bool mute) = 0;
+
+ /**
+ Stops or resumes receiving all remote audio streams.
+
+ This method works for all remote users that join or will join a channel
+ using the `joinChannel` method. It is
+ equivalent to the `autoSubscribeAudio` member in the ChannelMediaOptions
+ class.
+ - Ensure that you call this method after joining a channel.
+ - If you call muteAllRemoteAudioStreams(true) after joining a channel, the
+ local user stops receiving any audio stream from any user in the channel,
+ including any user who joins the channel after you call this method.
+ - If you call muteAllRemoteAudioStreams(true) after leaving a channel, the
+ local user does not receive any audio stream the next time the user joins a
+ channel.
+
+ After you successfully call muteAllRemoteAudioStreams(true), you can take
+ the following actions:
+ - To resume receiving all remote audio streams, call
+ muteAllRemoteAudioStreams(false).
+ - To resume receiving the audio stream of a specified user, call
+ muteRemoteAudioStream(uid, false), where uid is the ID of the user whose
+ audio stream you want to resume receiving.
+
+ @note
+ - The result of calling this method is affected by calling
+ \ref IRtcEngine::enableAudio "enableAudio" and
+ \ref IRtcEngine::disableAudio "disableAudio".
Settings in
+ muteAllRemoteAudioStreams stop taking effect if either of the following occurs:
+ - Calling `enableAudio` after muteAllRemoteAudioStreams(true).
+ - Calling `disableAudio` after muteAllRemoteAudioStreams(false).
+ - This method only works for the channel created by calling
+ `joinChannel`. To set whether to receive remote
+ audio streams for a specific channel, Agora recommends using
+ `autoSubscribeAudio` in the ChannelMediaOptions class.
+ @param mute Whether to stop receiving remote audio streams:
+ - true: Stop receiving any remote audio stream.
+ - false: (Default) Resume receiving all remote audio streams.
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int muteAllRemoteAudioStreams(bool mute) = 0;
+
+ /**
+ * Determines whether to receive all remote audio streams by default.
+ *
+ * @deprecated This method is deprecated. To set whether to receive remote
+ * audio streams by default, call
+ * \ref IRtcEngine::muteAllRemoteAudioStreams "muteAllRemoteAudioStreams"
+ * before calling `joinChannel`.
+ *
+ * Use this method to set whether to receive audio streams of subsequent peer
+ * users. Agora recommends calling it before joining a channel.
+ *
+ * A successful call of setDefaultMuteAllRemoteAudioStreams(true) means that
+ * the local user does not receive any audio stream after joining a channel.
+ * @param mute Whether to receive remote audio streams by default:
+ * - true: Do not receive any remote audio stream by default.
+ * - false: (Default) Receive remote audio streams by default.
+ *
+ * @return int
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setDefaultMuteAllRemoteAudioStreams(bool mute) __deprecated = 0;
+
+ /**
+ * Stops or resumes receiving the audio stream of a specified user.
+ *
+ * @note
+ * You can call this method before or after joining a channel. If a user
+ * leaves a channel, the settings in this method become invalid.
+ *
+ * @param uid The ID of the specified user.
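The interaction between `muteAllRemoteAudioStreams` and the per-user mute calls described above is easy to misread, in particular that a blanket mute also covers users who join later and that a single user can then be resumed individually. The toy model below is not SDK code; the class and its reset-on-muteAll policy are illustrative simplifications of the documented behavior.

```cpp
#include <cassert>
#include <map>

// Toy model (hypothetical, not SDK code) of the documented semantics:
// muteAll(true) suppresses audio from everyone, including users who join
// later; muteUser(uid, false) resumes one user individually.
class RemoteAudioModel {
public:
    void muteAll(bool mute) {
        allMuted_ = mute;
        overrides_.clear();          // a blanket call resets per-user choices
    }
    void muteUser(unsigned uid, bool mute) { overrides_[uid] = mute; }
    bool receives(unsigned uid) const {
        auto it = overrides_.find(uid);
        if (it != overrides_.end()) return !it->second;
        return !allMuted_;           // the default covers users joining later
    }
private:
    bool allMuted_ = false;
    std::map<unsigned, bool> overrides_;
};
```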
+ * @param mute Whether to stop receiving the audio stream of the specified user: + * - true: Stop receiving the audio stream of the specified user. + * - false: (Default) Resume receiving the audio stream of the specified user. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int muteRemoteAudioStream(uid_t uid, bool mute) = 0; + + /** + * Stops or resumes sending the local video stream. + * + * @param mute Determines whether to send or stop sending the local video stream: + * - true: Stop sending the local video stream. + * - false: (Default) Send the local video stream. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int muteLocalVideoStream(bool mute) = 0; + + /** + * Disables or re-enables the local video capture. + * + * Once you enable the video using \ref enableVideo "enableVideo", the local video is enabled + * by default. This method disables or re-enables the local video capture. + * + * `enableLocalVideo(false)` applies to scenarios when the user wants to watch the remote video + * without sending any video stream to the other user. + * + * @note + * Call this method after `enableVideo`. Otherwise, this method may not work properly. + * + * @param enabled Determines whether to disable or re-enable the local video, including + * the capturer, renderer, and sender: + * - true: (Default) Re-enable the local video. + * - false: Disable the local video. Once the local video is disabled, the remote + * users can no longer receive the video stream of this user, while this user + * can still receive the video streams of other remote users. When you set + * `enabled` as `false`, this method does not require a local camera. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableLocalVideo(bool enabled) = 0; + + /** Stops or resumes receiving all remote video streams. + + This method works for all remote users that join or will join a channel + using the `joinChannel` method. 
It is
+ equivalent to the `autoSubscribeVideo` member in the ChannelMediaOptions
+ class.
+ - Ensure that you call this method after joining a channel.
+ - If you call muteAllRemoteVideoStreams(true) after joining a channel, the
+ local user stops receiving any video stream from any user in the channel,
+ including any user who joins the channel after you call this method.
+ - If you call muteAllRemoteVideoStreams(true) after leaving a channel,
+ the local user does not receive any video stream the next time the user
+ joins a channel.
+
+ After you successfully call muteAllRemoteVideoStreams(true), you can take
+ the following actions:
+ - To resume receiving all remote video streams, call
+ muteAllRemoteVideoStreams(false).
+ - To resume receiving the video stream of a specified user, call
+ muteRemoteVideoStream(uid, false), where uid is the ID of the user whose
+ video stream you want to resume receiving.
+
+ @note
+ - The result of calling this method is affected by calling
+ \ref IRtcEngine::enableVideo "enableVideo" and
+ \ref IRtcEngine::disableVideo "disableVideo". Settings in
+ muteAllRemoteVideoStreams stop taking effect if either of the following occurs:
+ - Calling `enableVideo` after muteAllRemoteVideoStreams(true).
+ - Calling `disableVideo` after muteAllRemoteVideoStreams(false).
+ - This method only works for the channel created by calling `joinChannel`.
+ To set whether to receive remote video streams for a specific channel, Agora
+ recommends using `autoSubscribeVideo` in the ChannelMediaOptions class.
+ @param mute Whether to stop receiving remote video streams:
+ - true: Stop receiving any remote video stream.
+ - false: (Default) Resume receiving all remote video streams.
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int muteAllRemoteVideoStreams(bool mute) = 0;
+
+ /**
+ Determines whether to receive all remote video streams by default.
+
+ @deprecated This method is deprecated.
To set whether to receive remote
+ video streams by default, call
+ \ref IRtcEngine::muteAllRemoteVideoStreams "muteAllRemoteVideoStreams"
+ before calling `joinChannel`.
+
+ Use this method to set whether to receive video streams of subsequent peer
+ users. Agora recommends calling it before joining a channel.
+
+ A successful call of setDefaultMuteAllRemoteVideoStreams(true) means that
+ the local user does not receive any video stream after joining a channel.
+
+ @param mute Whether to receive remote video streams by default:
+ - true: Do not receive any remote video stream by default.
+ - false: (Default) Receive remote video streams by default.
+ @return int
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int setDefaultMuteAllRemoteVideoStreams(bool mute) __deprecated = 0;
+
+ /**
+ * Sets the default stream type of the remote video if the remote user has enabled dual-stream mode.
+ *
+ * @param streamType Sets the default video stream type: #VIDEO_STREAM_TYPE.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setRemoteDefaultVideoStreamType(VIDEO_STREAM_TYPE streamType) = 0;
+
+ /**
+ * Stops or resumes receiving the video stream of a specified user.
+ *
+ * @note
+ * You can call this method before or after joining a channel. If a user
+ * leaves a channel, the settings in this method become invalid.
+ *
+ * @param uid The ID of the specified user.
+ * @param mute Whether to stop receiving the video stream of the specified user:
+ * - true: Stop receiving the video stream of the specified user.
+ * - false: (Default) Resume receiving the video stream of the specified user.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int muteRemoteVideoStream(uid_t uid, bool mute) = 0;
+
+ /**
+ * Sets the remote video stream type.
+ *
+ * If the remote user has enabled the dual-stream mode, the SDK receives the
+ * high-stream video by default. Call this method to switch to the low-stream video.
+ *
+ * @note
+ * This method applies to scenarios where the remote user has enabled the dual-stream mode using
+ * \ref enableDualStreamMode "enableDualStreamMode"(true) before joining the channel.
+ *
+ * @param uid ID of the remote user sending the video stream.
+ * @param streamType Sets the video stream type: #VIDEO_STREAM_TYPE.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setRemoteVideoStreamType(uid_t uid, VIDEO_STREAM_TYPE streamType) = 0;
+
+ /**
+ * Sets the remote video subscription options.
+ *
+ * @param uid ID of the remote user sending the video stream.
+ * @param options Sets the video subscription options.
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setRemoteVideoSubscriptionOptions(uid_t uid, const VideoSubscriptionOptions &options) = 0;
+
+ /**
+ * Sets the blocklist for subscribing to remote audio streams.
+ *
+ * @param uidList The list of IDs of users whose audio you do not want to subscribe to.
+ * @param uidNumber The number of user IDs in uidList.
+ *
+ * @note
+ * If a user's uid is in uidList, that user's audio is not subscribed to,
+ * even if muteRemoteAudioStream(uid, false) and muteAllRemoteAudioStreams(false) are called.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setSubscribeAudioBlocklist(uid_t* uidList, int uidNumber) = 0;
+
+ /**
+ * Sets the allowlist for subscribing to remote audio streams.
+ *
+ * @param uidList The list of IDs of users whose audio you want to subscribe to.
+ * @param uidNumber The number of user IDs in uidList.
+ *
+ * @note
+ * If a user's uid is in uidList, that user's audio is subscribed to,
+ * even if muteRemoteAudioStream(uid, true) and muteAllRemoteAudioStreams(true) are called.
+ *
+ * If a user is in the blocklist and allowlist at the same time, only the blocklist takes effect.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
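+ *
+ * @par Example
+ * A usage sketch (`engine` is assumed to be an initialized IRtcEngine
+ * pointer; the user IDs are placeholders):
+ * @code
+ * // Subscribe only to the audio of users 101 and 102.
+ * uid_t allowed[] = {101, 102};
+ * engine->setSubscribeAudioAllowlist(allowed, 2);
+ * // If a uid is also in the blocklist, the blocklist takes effect.
+ * @endcode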
+ */
+ virtual int setSubscribeAudioAllowlist(uid_t* uidList, int uidNumber) = 0;
+
+ /**
+ * Sets the blocklist for subscribing to remote video streams.
+ *
+ * @param uidList The list of IDs of users whose video you do not want to subscribe to.
+ * @param uidNumber The number of user IDs in uidList.
+ *
+ * @note
+ * If a user's uid is in uidList, that user's video is not subscribed to,
+ * even if muteRemoteVideoStream(uid, false) and muteAllRemoteVideoStreams(false) are called.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setSubscribeVideoBlocklist(uid_t* uidList, int uidNumber) = 0;
+
+ /**
+ * Sets the allowlist for subscribing to remote video streams.
+ *
+ * @param uidList The list of IDs of users whose video you want to subscribe to.
+ * @param uidNumber The number of user IDs in uidList.
+ *
+ * @note
+ * If a user's uid is in uidList, that user's video is subscribed to,
+ * even if muteRemoteVideoStream(uid, true) and muteAllRemoteVideoStreams(true) are called.
+ *
+ * If a user is in the blocklist and allowlist at the same time, only the blocklist takes effect.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setSubscribeVideoAllowlist(uid_t* uidList, int uidNumber) = 0;
+
+ /**
+ * Enables the `onAudioVolumeIndication` callback to report on which users are speaking
+ * and the speakers' volume.
+ *
+ * Once the \ref IRtcEngineEventHandler::onAudioVolumeIndication "onAudioVolumeIndication"
+ * callback is enabled, the SDK returns the volume indication at the time interval set
+ * in `enableAudioVolumeIndication`, regardless of whether any user is speaking in the channel.
+ *
+ * @param interval Sets the time interval between two consecutive volume indications:
+ * - <= 0: Disables the volume indication.
+ * - > 0: Time interval (ms) between two consecutive volume indications. The
+ * value must be an integral multiple of 200; values less than 200 are set to 200.
+ * @param smooth The smoothing factor that sets the sensitivity of the audio volume + * indicator. The value range is [0, 10]. The greater the value, the more sensitive the + * indicator. The recommended value is 3. + * @param reportVad + * - `true`: Enable the voice activity detection of the local user. Once it is enabled, the `vad` parameter of the + * `onAudioVolumeIndication` callback reports the voice activity status of the local user. + * - `false`: (Default) Disable the voice activity detection of the local user. Once it is disabled, the `vad` parameter + * of the `onAudioVolumeIndication` callback does not report the voice activity status of the local user, except for + * the scenario where the engine automatically detects the voice activity of the local user. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableAudioVolumeIndication(int interval, int smooth, bool reportVad) = 0; + + /** Starts an audio recording. + + The SDK allows recording during a call, which supports either one of the + following two formats: + + - .wav: Large file size with high sound fidelity + - .aac: Small file size with low sound fidelity + + Ensure that the directory to save the recording file exists and is writable. + This method is usually called after the joinChannel() method. + The recording automatically stops when the leaveChannel() method is + called. + + @param filePath Full file path of the recording file. The string of the file + name is in UTF-8 code. + @param quality Sets the audio recording quality: #AUDIO_RECORDING_QUALITY_TYPE. + @return + - 0: Success. + - < 0: Failure. + */ + virtual int startAudioRecording(const char* filePath, + AUDIO_RECORDING_QUALITY_TYPE quality) = 0; + /** Starts an audio recording. 
+
+ The SDK allows recording during a call, which supports either one of the
+ following two formats:
+
+ - .wav: Large file size with high sound fidelity
+ - .aac: Small file size with low sound fidelity
+
+ Ensure that the directory to save the recording file exists and is writable.
+ This method is usually called after the joinChannel() method.
+ The recording automatically stops when the leaveChannel() method is
+ called.
+
+ @param filePath Full file path of the recording file. The string of the file
+ name is in UTF-8 code.
+ @param sampleRate Sample rate (Hz). The value must be 16000, 32000, 44100, or 48000.
+ @param quality Sets the audio recording quality: #AUDIO_RECORDING_QUALITY_TYPE.
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int startAudioRecording(const char* filePath,
+ int sampleRate,
+ AUDIO_RECORDING_QUALITY_TYPE quality) = 0;
+
+ /** Starts an audio recording.
+
+ The SDK allows recording during a call, which supports either one of the
+ following two formats:
+
+ - .wav: Large file size with high sound fidelity
+ - .aac: Small file size with low sound fidelity
+
+ Ensure that the directory to save the recording file exists and is writable.
+ This method is usually called after the joinChannel() method.
+ The recording automatically stops when the leaveChannel() method is
+ called.
+
+ @param config Audio recording config.
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int startAudioRecording(const AudioRecordingConfiguration& config) = 0;
+
+ /** Registers an encoded audio frame observer.
+ @param config The configuration of the encoded audio frame observer.
+ @param observer The encoded audio frame observer.
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int registerAudioEncodedFrameObserver(const AudioEncodedFrameObserverConfig& config, IAudioEncodedFrameObserver *observer) = 0;
+
+ /** Stops the audio recording on the client.
+
+ The recording automatically stops when the leaveChannel() method is called.
+
+ @return
+ - 0: Success.
+ - < 0: Failure.
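+
+ A usage sketch (`engine` is assumed to be an initialized IRtcEngine
+ pointer; the file path is a placeholder):
+ @code
+ // Record the call to an AAC file, then stop the recording later.
+ AudioRecordingConfiguration config;
+ config.filePath = "/sdcard/recording.aac";
+ config.quality = AUDIO_RECORDING_QUALITY_HIGH;
+ engine->startAudioRecording(config);
+ // ... during the call ...
+ engine->stopAudioRecording();
+ @endcode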
+ */
+ virtual int stopAudioRecording() = 0;
+
+ /**
+ * Creates a media player source object and returns its pointer. If the full-featured
+ * media player source is supported, the SDK creates it; otherwise, it creates a simple
+ * media player.
+ *
+ * @return
+ * - The pointer to \ref rtc::IMediaPlayerSource "IMediaPlayerSource",
+ * if the method call succeeds.
+ * - The empty pointer NULL, if the method call fails.
+ */
+ virtual agora_refptr<IMediaPlayer> createMediaPlayer() = 0;
+
+ /**
+ * Destroys a media player source instance.
+ * Once a media player source instance is destroyed, its video and audio can no
+ * longer be published.
+ *
+ * @param media_player The pointer to \ref rtc::IMediaPlayerSource.
+ *
+ * @return
+ * - >0: The id of the media player source instance.
+ * - < 0: Failure.
+ */
+ virtual int destroyMediaPlayer(agora_refptr<IMediaPlayer> media_player) = 0;
+
+ /**
+ * Creates a media recorder object and returns its pointer.
+ *
+ * @param info The RecorderStreamInfo object. It contains the user ID and the channel name.
+ *
+ * @return
+ * - The pointer to \ref rtc::IMediaRecorder "IMediaRecorder",
+ * if the method call succeeds.
+ * - The empty pointer NULL, if the method call fails.
+ */
+ virtual agora_refptr<IMediaRecorder> createMediaRecorder(const RecorderStreamInfo& info) = 0;
+
+ /**
+ * Destroys a media recorder object.
+ *
+ * @param mediaRecorder The pointer to \ref rtc::IMediaRecorder.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int destroyMediaRecorder(agora_refptr<IMediaRecorder> mediaRecorder) = 0;
+
+ /** Starts playing and mixing the music file.
+
+ This method mixes the specified local audio file with the audio stream from
+ the microphone. You can choose whether the other user can hear the local
+ audio playback and specify the number of playback loops. This method also
+ supports online music playback.
+
+ After calling this method successfully, the SDK triggers the
+ \ref IRtcEngineEventHandler::onAudioMixingStateChanged "onAudioMixingStateChanged" (PLAY)
+ callback on the local client.
+ When the audio mixing file playback finishes after calling this method, the
+ SDK triggers the
+ \ref IRtcEngineEventHandler::onAudioMixingStateChanged "onAudioMixingStateChanged" (STOPPED)
+ callback on the local client.
+
+ @note
+ - Call this method after joining a channel, otherwise issues may occur.
+ - If the local audio mixing file does not exist, or if the SDK does not
+ support the file format or cannot access the music file URL, the SDK returns
+ #WARN_AUDIO_MIXING_OPEN_ERROR (701).
+ - If you want to play an online music file, ensure that the interval
+ between two calls to this method is more than 100 ms; otherwise, the
+ `AUDIO_MIXING_ERROR_TOO_FREQUENT_CALL(702)` error code occurs.
+
+ @param filePath Pointer to the absolute path (including the filename
+ extension) of the local or online audio file to mix, for example,
+ c:/music/audio.mp4. Supported audio formats: 3GP, ASF, ADTS, AVI, MP3, MP4,
+ MPEG-4, SAMI, and WAVE.
+ @param loopback Sets which user can hear the audio mixing:
+ - true: Only the local user can hear the audio mixing.
+ - false: Both users can hear the audio mixing.
+
+ @param cycle Sets the number of playback loops:
+ - Positive integer: Number of playback loops.
+ - `-1`: Infinite playback loops.
+
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int startAudioMixing(const char* filePath, bool loopback, int cycle) = 0;
+
+ /** Starts playing and mixing the music file.
+
+ This method mixes the specified local audio file with the audio stream from
+ the microphone. You can choose whether the other user can hear the local
+ audio playback and specify the number of playback loops. This method also
+ supports online music playback.
+
+ After calling this method successfully, the SDK triggers the
+ \ref IRtcEngineEventHandler::onAudioMixingStateChanged "onAudioMixingStateChanged" (PLAY)
+ callback on the local client.
+ When the audio mixing file playback finishes after calling this method, the
+ SDK triggers the
+ \ref IRtcEngineEventHandler::onAudioMixingStateChanged "onAudioMixingStateChanged" (STOPPED)
+ callback on the local client.
+
+ @note
+ - Call this method after joining a channel, otherwise issues may occur.
+ - If the local audio mixing file does not exist, or if the SDK does not
+ support the file format or cannot access the music file URL, the SDK returns
+ #WARN_AUDIO_MIXING_OPEN_ERROR (701).
+ - If you want to play an online music file, ensure that the interval
+ between two calls to this method is more than 100 ms; otherwise, the
+ `AUDIO_MIXING_ERROR_TOO_FREQUENT_CALL(702)` error code occurs.
+
+ @param filePath Pointer to the absolute path (including the filename
+ extension) of the local or online audio file to mix, for example,
+ c:/music/audio.mp4. Supported audio formats: 3GP, ASF, ADTS, AVI, MP3, MP4,
+ MPEG-4, SAMI, and WAVE.
+ @param loopback Sets which user can hear the audio mixing:
+ - true: Only the local user can hear the audio mixing.
+ - false: Both users can hear the audio mixing.
+
+ @param cycle Sets the number of playback loops:
+ - Positive integer: Number of playback loops.
+ - `-1`: Infinite playback loops.
+
+ @param startPos The playback position (ms) of the music file.
+
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int startAudioMixing(const char* filePath, bool loopback, int cycle, int startPos) = 0;
+
+ /** Stops playing and mixing the music file.
+
+ Call this method when you are in a channel.
+
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int stopAudioMixing() = 0;
+
+ /** Pauses playing and mixing the music file.
+
+ Call this method when you are in a channel.
+
+ @return
+ - 0: Success.
+ - < 0: Failure.
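+
+ A usage sketch (`engine` is assumed to be an initialized IRtcEngine
+ pointer; the file path is a placeholder):
+ @code
+ // Mix a local music file, looping twice, audible to remote users too.
+ engine->startAudioMixing("/path/to/music.mp3", false, 2);
+ engine->pauseAudioMixing();   // pause playback and mixing
+ engine->resumeAudioMixing();  // resume from the paused position
+ @endcode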
+ */
+ virtual int pauseAudioMixing() = 0;
+
+ /** Resumes playing and mixing the music file.
+
+ Call this method when you are in a channel.
+
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int resumeAudioMixing() = 0;
+
+ /** Selects the audio track of the music file.
+
+ Call this method when you are in a channel.
+
+ @param index The index of the audio track.
+
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int selectAudioTrack(int index) = 0;
+ /** Gets the audio track count of the music file.
+
+ Call this method when you are in a channel.
+
+ @return
+ - ≥ 0: Audio track count of the music file, if the method call is successful.
+ - < 0: Failure.
+ */
+ virtual int getAudioTrackCount() = 0;
+
+ /** Adjusts the volume during audio mixing.
+
+ Call this method when you are in a channel.
+
+ @note This method does not affect the volume of audio effect file playback
+ invoked by the \ref IRtcEngine::playEffect "playEffect" method.
+
+ @param volume The audio mixing volume. The value ranges between 0 and 100
+ (default).
+
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int adjustAudioMixingVolume(int volume) = 0;
+
+ /** Adjusts the audio mixing volume for publishing (for remote users).
+ @note Call this method when you are in a channel.
+ @param volume Audio mixing volume for publishing. The value ranges between 0 and 100 (default).
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int adjustAudioMixingPublishVolume(int volume) = 0;
+
+ /** Retrieves the audio mixing volume for publishing.
+ This method helps troubleshoot audio volume related issues.
+ @note Call this method when you are in a channel.
+ @return
+ - ≥ 0: The audio mixing volume for publishing, if this method call succeeds. The value range is [0,100].
+ - < 0: Failure.
+ */
+ virtual int getAudioMixingPublishVolume() = 0;
+
+ /** Adjusts the audio mixing volume for local playback.
+ @note Call this method when you are in a channel.
+ @param volume Audio mixing volume for local playback.
The value ranges between 0 and 100 (default).
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int adjustAudioMixingPlayoutVolume(int volume) = 0;
+
+ /** Retrieves the audio mixing volume for local playback.
+ This method helps troubleshoot audio volume related issues.
+ @note Call this method when you are in a channel.
+ @return
+ - ≥ 0: The audio mixing volume, if this method call succeeds. The value range is [0,100].
+ - < 0: Failure.
+ */
+ virtual int getAudioMixingPlayoutVolume() = 0;
+
+ /** Gets the duration (ms) of the music file.
+
+ Call this method when you are in a channel.
+
+ @return
+ - Returns the audio mixing duration, if the method call is successful.
+ - < 0: Failure.
+ */
+ virtual int getAudioMixingDuration() = 0;
+
+ /** Gets the playback position (ms) of the music file.
+
+ Call this method when you are in a channel.
+
+ @return
+ - ≥ 0: The current playback position of the audio mixing, if this method
+ call succeeds.
+ - < 0: Failure.
+ */
+ virtual int getAudioMixingCurrentPosition() = 0;
+
+ /** Sets the playback position of the music file to a different starting
+ position (the default plays from the beginning).
+
+ @param pos The playback starting position (ms) of the audio mixing file.
+
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int setAudioMixingPosition(int pos /*in ms*/) = 0;
+
+ /** A dual-channel music file can store different audio data on the left and right channels.
+ * You can set the channel mode to the original, left-channel, right-channel,
+ * or mixed mode as needed.
+
+ @param mode The channel mode: #AUDIO_MIXING_DUAL_MONO_MODE.
+
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int setAudioMixingDualMonoMode(media::AUDIO_MIXING_DUAL_MONO_MODE mode) = 0;
+
+ /** Sets the pitch of the local music file.
+ *
+ * When a local music file is mixed with a local human voice, call this method to set the pitch of the local music file only.
+ *
+ * @note Call this method after calling \ref IRtcEngine::startAudioMixing "startAudioMixing" and
+ * receiving the \ref IRtcEngineEventHandler::onAudioMixingStateChanged "onAudioMixingStateChanged" (AUDIO_MIXING_STATE_PLAYING) callback.
+ *
+ * @param pitch Sets the pitch of the local music file by chromatic scale. The default value is 0,
+ * which means keeping the original pitch. The value ranges from -12 to 12, and each adjacent
+ * value differs by one semitone. The greater the absolute value of this parameter, the
+ * higher or lower the pitch of the local music file.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setAudioMixingPitch(int pitch) = 0;
+ /**
+ * Gets the volume of audio effects.
+ *
+ * @return
+ * - ≥ 0: The volume of audio effects. The value ranges between 0 and 100 (original volume).
+ * - < 0: Failure.
+ */
+ virtual int getEffectsVolume() = 0;
+ /** Sets the volume of audio effects.
+ *
+ * @param volume The volume of audio effects. The value ranges between 0
+ * and 100 (original volume).
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setEffectsVolume(int volume) = 0;
+ /** Preloads a specified audio effect.
+ *
+ * This method preloads only one specified audio effect into the memory each time
+ * it is called. To preload multiple audio effects, call this method multiple times.
+ *
+ * After preloading, you can call \ref IRtcEngine::playEffect "playEffect"
+ * to play the preloaded audio effect or call
+ * \ref IRtcEngine::playAllEffects "playAllEffects" to play all the preloaded
+ * audio effects.
+ *
+ * @note
+ * - To ensure smooth communication, limit the size of the audio effect file.
+ * - Agora recommends calling this method before joining the channel.
+ *
+ * @param soundId The ID of the audio effect.
+ * @param filePath The absolute path of the local audio effect file or the URL
+ * of the online audio effect file.
Supported audio formats: mp3, mp4, m4a, aac,
+ * 3gp, mkv, and wav.
+ * @param startPos The playback position (ms) from which the audio effect file starts playing.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int preloadEffect(int soundId, const char* filePath, int startPos = 0) = 0;
+ /** Plays a specified audio effect.
+ *
+ * After calling \ref IRtcEngine::preloadEffect "preloadEffect", you can call
+ * this method to play the specified audio effect for all users in
+ * the channel.
+ *
+ * This method plays only one specified audio effect each time it is called.
+ * To play multiple audio effects, call this method multiple times.
+ *
+ * @note
+ * - Agora recommends playing no more than three audio effects at the same time.
+ * - The ID and file path of the audio effect in this method must be the same
+ * as that in the \ref IRtcEngine::preloadEffect "preloadEffect" method.
+ *
+ * @param soundId The ID of the audio effect.
+ * @param filePath The absolute path of the local audio effect file or the URL
+ * of the online audio effect file. Supported audio formats: mp3, mp4, m4a, aac,
+ * 3gp, mkv, and wav.
+ * @param loopCount The number of times the audio effect loops:
+ * - `-1`: Play the audio effect in an indefinite loop until
+ * \ref IRtcEngine::stopEffect "stopEffect" or
+ * \ref IRtcEngine::stopAllEffects "stopAllEffects"
+ * - `0`: Play the audio effect once.
+ * - `1`: Play the audio effect twice.
+ * @param pitch The pitch of the audio effect. The value ranges between 0.5 and 2.0.
+ * The default value is `1.0` (original pitch). The lower the value, the lower the pitch.
+ * @param pan The spatial position of the audio effect. The value ranges between -1.0 and 1.0:
+ * - `-1.0`: The audio effect displays to the left.
+ * - `0.0`: The audio effect displays ahead.
+ * - `1.0`: The audio effect displays to the right.
+ * @param gain The volume of the audio effect. The value ranges between 0 and 100.
+ * The default value is `100` (original volume). The lower the value, the lower
+ * the volume of the audio effect.
+ * @param publish Sets whether to publish the audio effect to the remote:
+ * - true: Publish the audio effect to the remote.
+ * - false: (Default) Do not publish the audio effect to the remote.
+ * @param startPos The playback position (ms) from which the audio effect file starts playing.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int playEffect(int soundId, const char* filePath, int loopCount, double pitch, double pan, int gain, bool publish = false, int startPos = 0) = 0;
+ /** Plays all audio effects.
+ *
+ * After calling \ref IRtcEngine::preloadEffect "preloadEffect" multiple times
+ * to preload multiple audio effects into the memory, you can call this
+ * method to play all the specified audio effects for all users in
+ * the channel.
+ *
+ * @param loopCount The number of times the audio effect loops:
+ * - `-1`: Play the audio effect in an indefinite loop until
+ * \ref IRtcEngine::stopEffect "stopEffect" or
+ * \ref IRtcEngine::stopAllEffects "stopAllEffects"
+ * - `0`: Play the audio effect once.
+ * - `1`: Play the audio effect twice.
+ * @param pitch The pitch of the audio effect. The value ranges between 0.5 and 2.0.
+ * The default value is `1.0` (original pitch). The lower the value, the lower the pitch.
+ * @param pan The spatial position of the audio effect. The value ranges between -1.0 and 1.0:
+ * - `-1.0`: The audio effect displays to the left.
+ * - `0.0`: The audio effect displays ahead.
+ * - `1.0`: The audio effect displays to the right.
+ * @param gain The volume of the audio effect. The value ranges between 0 and 100.
+ * The default value is `100` (original volume). The lower the value, the lower
+ * the volume of the audio effect.
+ * @param publish Sets whether to publish the audio effect to the remote:
+ * - true: Publish the audio effect to the remote.
+ * - false: (Default) Do not publish the audio effect to the remote.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
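+ *
+ * @par Example
+ * A usage sketch (`engine` is assumed to be an initialized IRtcEngine
+ * pointer; the sound IDs and file paths are placeholders):
+ * @code
+ * // Preload two effects, then play them all once at the original pitch,
+ * // centered, at full volume, without publishing to remote users.
+ * engine->preloadEffect(1, "/path/to/ding.wav");
+ * engine->preloadEffect(2, "/path/to/clap.wav");
+ * engine->playAllEffects(0, 1.0, 0.0, 100, false);
+ * @endcode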
+ */ + virtual int playAllEffects(int loopCount, double pitch, double pan, int gain, bool publish = false) = 0; + + /** Gets the volume of the specified audio effect. + * + * @param soundId The ID of the audio effect. + * + * @return + * - ≥ 0: The volume of the specified audio effect. The value ranges + * between 0 and 100 (original volume). + * - < 0: Failure. + */ + virtual int getVolumeOfEffect(int soundId) = 0; + + /** Sets the volume of the specified audio effect. + * + * @param soundId The ID of the audio effect. + * @param volume The volume of the specified audio effect. The value ranges + * between 0 and 100 (original volume). + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setVolumeOfEffect(int soundId, int volume) = 0; + /** Pauses playing the specified audio effect. + * + * @param soundId The ID of the audio effect. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int pauseEffect(int soundId) = 0; + /** Pauses playing audio effects. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int pauseAllEffects() = 0; + /** Resumes playing the specified audio effect. + * + * @param soundId The ID of the audio effect. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int resumeEffect(int soundId) = 0; + /** Resumes playing audio effects. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int resumeAllEffects() = 0; + /** Stops playing the specified audio effect. + * + * @param soundId The ID of the audio effect. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int stopEffect(int soundId) = 0; + /** Stops playing audio effects. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int stopAllEffects() = 0; + /** Releases the specified preloaded audio effect from the memory. + * + * @param soundId The ID of the audio effect. + * + * @return + * - 0: Success. + * - < 0: Failure. 
+ */
+ virtual int unloadEffect(int soundId) = 0;
+ /** Releases preloaded audio effects from the memory.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int unloadAllEffects() = 0;
+ /**
+ * Gets the duration of the audio effect file.
+ * @note
+ * - Call this method after joining a channel.
+ * - For the audio file formats supported by this method, see [What formats of audio files does the Agora RTC SDK support](https://docs.agora.io/en/faq/audio_format).
+ *
+ * @param filePath The absolute path or URL address (including the filename extensions)
+ * of the music file. For example: `C:\music\audio.mp4`.
+ * When you access a local file on Android, Agora recommends passing a URI address or
+ * a path starting with `/assets/` in this parameter.
+ *
+ * @return
+ * - ≥ 0: A successful method call. Returns the total duration (ms) of
+ * the specified audio effect file.
+ * - < 0: Failure.
+ * - `-22(ERR_RESOURCE_LIMITED)`: Cannot find the audio effect file. Please
+ * set a correct `filePath`.
+ */
+ virtual int getEffectDuration(const char* filePath) = 0;
+ /**
+ * Sets the playback position of an audio effect file.
+ * After a successful setting, the local audio effect file starts playing at the specified position.
+ *
+ * @note Call this method after \ref IRtcEngine::playEffect(int,const char*,int,double,double,int,bool,int) "playEffect" .
+ *
+ * @param soundId Audio effect ID. Ensure that this parameter is set to the
+ * same value as in \ref IRtcEngine::playEffect(int,const char*,int,double,double,int,bool,int) "playEffect" .
+ * @param pos The playback position (ms) of the audio effect file.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ * - `-22(ERR_RESOURCE_LIMITED)`: Cannot find the audio effect file. Please
+ * set a correct `soundId`.
+ */
+ virtual int setEffectPosition(int soundId, int pos) = 0;
+ /**
+ * Gets the playback position of the audio effect file.
+ * @note Call this method after \ref IRtcEngine::playEffect(int,const char*,int,double,double,int,bool,int) "playEffect" . + * + * @param soundId Audio effect ID. Ensure that this parameter is set to the + * same value as in \ref IRtcEngine::playEffect(int,const char*,int,double,double,int,bool,int) "playEffect" . + * + * @return + * - ≥ 0: A successful method call. Returns the playback position (ms) of + * the specified audio effect file. + * - < 0: Failure. + * - `-22(ERR_RESOURCE_LIMITED)`: Cannot find the audio effect file. Please + * set a correct `soundId`. + */ + virtual int getEffectCurrentPosition(int soundId) = 0; + /** Enables/Disables stereo panning for remote users. + + Ensure that you call this method before joinChannel to enable stereo panning for remote users so that the local user can track the position of a remote user by calling \ref agora::rtc::IRtcEngine::setRemoteVoicePosition "setRemoteVoicePosition". + + @param enabled Sets whether or not to enable stereo panning for remote users: + - true: enables stereo panning. + - false: disables stereo panning. + + @return + - 0: Success. + - < 0: Failure. + */ + virtual int enableSoundPositionIndication(bool enabled) = 0; + + /** Sets the sound position and gain of a remote user. + + When the local user calls this method to set the sound position of a remote user, the sound difference between the left and right channels allows the local user to track the real-time position of the remote user, creating a real sense of space. This method applies to massively multiplayer online games, such as Battle Royale games. + + @note + - For this method to work, enable stereo panning for remote users by calling the \ref agora::rtc::IRtcEngine::enableSoundPositionIndication "enableSoundPositionIndication" method before joining a channel. + - This method requires hardware support. For the best sound positioning, we recommend using a wired headset. + - Ensure that you call this method after joining a channel. 
+ + @param uid The ID of the remote user. + @param pan The sound position of the remote user. The value ranges from -1.0 to 1.0: + - 0.0: the remote sound comes from the front. + - -1.0: the remote sound comes from the left. + - 1.0: the remote sound comes from the right. + @param gain Gain of the remote user. The value ranges from 0.0 to 100.0. The default value is 100.0 (the original gain of the remote user). The smaller the value, the lower the gain. + + @return + - 0: Success. + - < 0: Failure. + */ + virtual int setRemoteVoicePosition(uid_t uid, double pan, double gain) = 0; + + /** Enables or disables spatial audio. + + @param enabled Whether to enable spatial audio: + - true: Enable spatial audio. + - false: Disable spatial audio. + @return + - 0: Success. + - < 0: Failure. + */ + virtual int enableSpatialAudio(bool enabled) = 0; + + /** Sets remote user parameters for spatial audio. + + @param uid The ID of the remote user. + @param params The spatial audio parameters. See SpatialAudioParams. + + @return + - 0: Success. + - < 0: Failure. + */ + virtual int setRemoteUserSpatialAudioParams(uid_t uid, const agora::SpatialAudioParams& params) = 0; + + /** Sets an SDK preset voice beautifier effect. + * + * Call this method to set an SDK preset voice beautifier effect for the local user who sends an + * audio stream. After setting a voice beautifier effect, all users in the channel can hear the + * effect. + * + * You can set different voice beautifier effects for different scenarios. See *Set the Voice + * Beautifier and Audio Effects*. + * + * To achieve better audio effect quality, Agora recommends calling \ref + * IRtcEngine::setAudioProfile "setAudioProfile" and setting the `scenario` parameter to + * `AUDIO_SCENARIO_GAME_STREAMING(3)` and the `profile` parameter to + * `AUDIO_PROFILE_MUSIC_HIGH_QUALITY(4)` or `AUDIO_PROFILE_MUSIC_HIGH_QUALITY_STEREO(5)` before + * calling this method. + * + * @note + * - You can call this method either before or after joining a channel.
+ * - Do not set the `profile` parameter of \ref IRtcEngine::setAudioProfile "setAudioProfile" to + * `AUDIO_PROFILE_SPEECH_STANDARD(1)` or `AUDIO_PROFILE_IOT(6)`; otherwise, this method call + * fails. + * - This method works best with the human voice. Agora does not recommend using this method for + * audio containing music. + * - After calling this method, Agora recommends not calling the following methods, because they + * can override \ref IRtcEngine::setAudioEffectParameters "setAudioEffectParameters": + * - \ref IRtcEngine::setAudioEffectPreset "setAudioEffectPreset" + * - \ref IRtcEngine::setVoiceBeautifierPreset "setVoiceBeautifierPreset" + * - \ref IRtcEngine::setLocalVoicePitch "setLocalVoicePitch" + * - \ref IRtcEngine::setLocalVoiceEqualization "setLocalVoiceEqualization" + * - \ref IRtcEngine::setLocalVoiceReverb "setLocalVoiceReverb" + * - \ref IRtcEngine::setVoiceBeautifierParameters "setVoiceBeautifierParameters" + * + * @param preset The options for SDK preset voice beautifier effects: #VOICE_BEAUTIFIER_PRESET. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setVoiceBeautifierPreset(VOICE_BEAUTIFIER_PRESET preset) = 0; + + /** Sets an SDK preset audio effect. + * + * Call this method to set an SDK preset audio effect for the local user who sends an audio + * stream. This audio effect does not change the gender characteristics of the original voice. + * After setting an audio effect, all users in the channel can hear the effect. + * + * You can set different audio effects for different scenarios. See *Set the Voice Beautifier and + * Audio Effects*. + * + * To achieve better audio effect quality, Agora recommends calling \ref + * IRtcEngine::setAudioProfile "setAudioProfile" and setting the `scenario` parameter to + * `AUDIO_SCENARIO_GAME_STREAMING(3)` before calling this method. + * + * @note + * - You can call this method either before or after joining a channel. 
+ * - Do not set the `profile` parameter of `setAudioProfile` to `AUDIO_PROFILE_SPEECH_STANDARD(1)` + * or `AUDIO_PROFILE_IOT(6)`; otherwise, this method call fails. + * - This method works best with the human voice. Agora does not recommend using this method for + * audio containing music. + * - If you call this method and set the `preset` parameter to enumerators except + * `ROOM_ACOUSTICS_3D_VOICE` or `PITCH_CORRECTION`, do not call \ref + * IRtcEngine::setAudioEffectParameters "setAudioEffectParameters"; otherwise, + * `setAudioEffectParameters` overrides this method. + * - After calling this method, Agora recommends not calling the following methods, because they + * can override `setAudioEffectPreset`: + * - \ref IRtcEngine::setVoiceBeautifierPreset "setVoiceBeautifierPreset" + * - \ref IRtcEngine::setLocalVoicePitch "setLocalVoicePitch" + * - \ref IRtcEngine::setLocalVoiceEqualization "setLocalVoiceEqualization" + * - \ref IRtcEngine::setLocalVoiceReverb "setLocalVoiceReverb" + * - \ref IRtcEngine::setVoiceBeautifierParameters "setVoiceBeautifierParameters" + * + * @param preset The options for SDK preset audio effects. See #AUDIO_EFFECT_PRESET. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setAudioEffectPreset(AUDIO_EFFECT_PRESET preset) = 0; + + /** Sets an SDK preset voice conversion. + * + * Call this method to set an SDK preset voice conversion for the local user who sends an audio + * stream. After setting a voice conversion effect, all users in the channel can hear the effect. + * + * You can set different voice conversion effects for different scenarios. See *Set the Voice Beautifier and + * Audio Effects*. + * + * To achieve better voice conversion quality, Agora recommends calling \ref + * IRtcEngine::setAudioProfile "setAudioProfile" and setting the `scenario` parameter to + * `AUDIO_SCENARIO_GAME_STREAMING(3)` before calling this method. + * + * @note + * - You can call this method either before or after joining a channel.
+ * - Do not set the `profile` parameter of `setAudioProfile` to `AUDIO_PROFILE_SPEECH_STANDARD(1)` + * or `AUDIO_PROFILE_IOT(6)`; otherwise, this method call fails. + * - This method works best with the human voice. Agora does not recommend using this method for + * audio containing music. + * - After calling this method, Agora recommends not calling the following methods, because they + * can override `setVoiceConversionPreset`: + * - \ref IRtcEngine::setVoiceBeautifierPreset "setVoiceBeautifierPreset" + * - \ref IRtcEngine::setAudioEffectPreset "setAudioEffectPreset" + * - \ref IRtcEngine::setLocalVoicePitch "setLocalVoicePitch" + * - \ref IRtcEngine::setLocalVoiceFormant "setLocalVoiceFormant" + * - \ref IRtcEngine::setLocalVoiceEqualization "setLocalVoiceEqualization" + * - \ref IRtcEngine::setLocalVoiceReverb "setLocalVoiceReverb" + * - \ref IRtcEngine::setVoiceBeautifierParameters "setVoiceBeautifierParameters" + * - \ref IRtcEngine::setAudioEffectParameters "setAudioEffectParameters" + * + * @param preset The options for SDK preset voice conversion. See #VOICE_CONVERSION_PRESET. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setVoiceConversionPreset(VOICE_CONVERSION_PRESET preset) = 0; + + /** Sets parameters for SDK preset audio effects. + * + * Call this method to set the following parameters for the local user who sends an audio stream: + * - 3D voice effect: Sets the cycle period of the 3D voice effect. + * - Pitch correction effect: Sets the basic mode and tonic pitch of the pitch correction effect. + * Different songs have different modes and tonic pitches. Agora recommends binding this method + * with interface elements to enable users to adjust the pitch correction interactively. + * + * After setting parameters, all users in the channel can hear the relevant effect.
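As a quick reference for the pitch-correction parameters of `setAudioEffectParameters`, the sketch below maps a tonic pitch name to the documented `param2` value (1 = A through 12 = G#, with C = 4 as the default). The helper name is illustrative only and is not part of the SDK.

```cpp
#include <map>
#include <string>

// Maps a tonic pitch name to the `param2` value documented for
// PITCH_CORRECTION. `tonicPitchParam` is an illustrative helper,
// not an SDK API.
int tonicPitchParam(const std::string& note) {
    static const std::map<std::string, int> table = {
        {"A", 1},  {"A#", 2}, {"B", 3},  {"C", 4},  {"C#", 5},  {"D", 6},
        {"D#", 7}, {"E", 8},  {"F", 9},  {"F#", 10}, {"G", 11}, {"G#", 12}};
    auto it = table.find(note);
    return it != table.end() ? it->second : 4;  // 4 = C, the documented default
}
```

A UI layer could feed the returned value straight into `param2` of `setAudioEffectParameters(PITCH_CORRECTION, mode, tonic)`.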
+ * + * You can call this method directly or after \ref IRtcEngine::setAudioEffectPreset + * "setAudioEffectPreset". If you call this method after \ref IRtcEngine::setAudioEffectPreset + * "setAudioEffectPreset", ensure that you set the preset parameter of `setAudioEffectPreset` to + * `ROOM_ACOUSTICS_3D_VOICE` or `PITCH_CORRECTION` and then call this method to set the same + * enumerator; otherwise, this method overrides `setAudioEffectPreset`. + * + * @note + * - You can call this method either before or after joining a channel. + * - To achieve better audio effect quality, Agora recommends calling \ref + * IRtcEngine::setAudioProfile "setAudioProfile" and setting the `scenario` parameter to + * `AUDIO_SCENARIO_GAME_STREAMING(3)` before calling this method. + * - Do not set the `profile` parameter of \ref IRtcEngine::setAudioProfile "setAudioProfile" to + * `AUDIO_PROFILE_SPEECH_STANDARD(1)` or `AUDIO_PROFILE_IOT(6)`; otherwise, this method call + * fails. + * - This method works best with the human voice. Agora does not recommend using this method for + * audio containing music. + * - After calling this method, Agora recommends not calling the following methods, because they + * can override `setAudioEffectParameters`: + * - \ref IRtcEngine::setAudioEffectPreset "setAudioEffectPreset" + * - \ref IRtcEngine::setVoiceBeautifierPreset "setVoiceBeautifierPreset" + * - \ref IRtcEngine::setLocalVoicePitch "setLocalVoicePitch" + * - \ref IRtcEngine::setLocalVoiceEqualization "setLocalVoiceEqualization" + * - \ref IRtcEngine::setLocalVoiceReverb "setLocalVoiceReverb" + * - \ref IRtcEngine::setVoiceBeautifierParameters "setVoiceBeautifierParameters" + * @param preset The options for SDK preset audio effects: + * - 3D voice effect: `ROOM_ACOUSTICS_3D_VOICE`. 
+ * - Call \ref IRtcEngine::setAudioProfile "setAudioProfile" and set the `profile` parameter to + * `AUDIO_PROFILE_MUSIC_STANDARD_STEREO(3)` or `AUDIO_PROFILE_MUSIC_HIGH_QUALITY_STEREO(5)` before + * setting this enumerator; otherwise, the enumerator setting does not take effect. + * - If the 3D voice effect is enabled, users need to use stereo audio playback devices to hear + * the anticipated voice effect. + * - Pitch correction effect: `PITCH_CORRECTION`. To achieve better audio effect quality, Agora + * recommends calling \ref IRtcEngine::setAudioProfile "setAudioProfile" and setting the `profile` + * parameter to `AUDIO_PROFILE_MUSIC_HIGH_QUALITY(4)` or + * `AUDIO_PROFILE_MUSIC_HIGH_QUALITY_STEREO(5)` before setting this enumerator. + * @param param1 + * - If you set `preset` to `ROOM_ACOUSTICS_3D_VOICE`, the `param1` sets the cycle period of the + * 3D voice effect. The value range is [1,60] and the unit is a second. The default value is 10 + * seconds, indicating that the voice moves around you every 10 seconds. + * - If you set `preset` to `PITCH_CORRECTION`, `param1` sets the basic mode of the pitch + * correction effect: + * - `1`: (Default) Natural major scale. + * - `2`: Natural minor scale. + * - `3`: Japanese pentatonic scale. + * @param param2 + * - If you set `preset` to `ROOM_ACOUSTICS_3D_VOICE`, you need to set `param2` to `0`. + * - If you set `preset` to `PITCH_CORRECTION`, `param2` sets the tonic pitch of the pitch + * correction effect: + * - `1`: A + * - `2`: A# + * - `3`: B + * - `4`: (Default) C + * - `5`: C# + * - `6`: D + * - `7`: D# + * - `8`: E + * - `9`: F + * - `10`: F# + * - `11`: G + * - `12`: G# + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setAudioEffectParameters(AUDIO_EFFECT_PRESET preset, int param1, int param2) = 0; + + /** Sets parameters for SDK preset voice beautifier effects. + * + * Call this method to set a gender characteristic and a reverberation effect for the singing + * beautifier effect. 
This method sets parameters for the local user who sends an audio stream. + * + * After you call this method successfully, all users in the channel can hear the relevant effect. + * + * To achieve better audio effect quality, before you call this method, Agora recommends calling + * \ref IRtcEngine::setAudioProfile "setAudioProfile", and setting the `scenario` parameter as + * `AUDIO_SCENARIO_GAME_STREAMING(3)` and the `profile` parameter as + * `AUDIO_PROFILE_MUSIC_HIGH_QUALITY(4)` or `AUDIO_PROFILE_MUSIC_HIGH_QUALITY_STEREO(5)`. + * + * @note + * - You can call this method either before or after joining a channel. + * - Do not set the `profile` parameter of \ref IRtcEngine::setAudioProfile "setAudioProfile" as + * `AUDIO_PROFILE_SPEECH_STANDARD(1)` or `AUDIO_PROFILE_IOT(6)`; otherwise, this method call does + * not take effect. + * - This method works best with the human voice. Agora does not recommend using this method for + * audio containing music. + * - After you call this method, Agora recommends not calling the following methods, because they + * can override `setVoiceBeautifierParameters`: + * - \ref IRtcEngine::setAudioEffectPreset "setAudioEffectPreset" + * - \ref IRtcEngine::setAudioEffectParameters "setAudioEffectParameters" + * - \ref IRtcEngine::setVoiceBeautifierPreset "setVoiceBeautifierPreset" + * - \ref IRtcEngine::setLocalVoicePitch "setLocalVoicePitch" + * - \ref IRtcEngine::setLocalVoiceEqualization "setLocalVoiceEqualization" + * - \ref IRtcEngine::setLocalVoiceReverb "setLocalVoiceReverb" + * + * @param preset The options for SDK preset voice beautifier effects: + * - `SINGING_BEAUTIFIER`: Singing beautifier effect. + * @param param1 The gender characteristics options for the singing voice: + * - `1`: A male-sounding voice. + * - `2`: A female-sounding voice. + * @param param2 The reverberation effects options: + * - `1`: The reverberation effect sounds like singing in a small room. 
+ * - `2`: The reverberation effect sounds like singing in a large room. + * - `3`: The reverberation effect sounds like singing in a hall. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setVoiceBeautifierParameters(VOICE_BEAUTIFIER_PRESET preset, + int param1, int param2) = 0; + + /** Sets parameters for SDK preset voice conversion. + * + * @note + * - This is a reserved interface. + * + * @param preset The options for SDK preset voice conversion. See #VOICE_CONVERSION_PRESET. + * @param param1 Reserved. + * @param param2 Reserved. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setVoiceConversionParameters(VOICE_CONVERSION_PRESET preset, + int param1, int param2) = 0; + + /** Changes the voice pitch of the local speaker. + + @param pitch The voice pitch. The value ranges between 0.5 and 2.0. The lower + the value, the lower the voice pitch. The default value is 1.0 (no change to + the local voice pitch). + + @return + - 0: Success. + - -1: Failure. + */ + virtual int setLocalVoicePitch(double pitch) = 0; + + /** Changes the voice formant ratio for the local speaker. + + @param formantRatio The voice formant ratio. The value ranges between -1.0 and 1.0. + The lower the value, the deeper the sound, and the higher the value, the more it + sounds like a child. The default value is 0.0 (the local user's voice will not be changed). + + @return + - 0: Success. + - -1: Failure. + */ + virtual int setLocalVoiceFormant(double formantRatio) = 0; + + /** Sets the local voice equalization effect. + + @param bandFrequency The band frequency index ranging from 0 to 9, representing the + respective 10-band center frequencies of the voice effects: 31, 62, + 125, 250, 500, 1k, 2k, 4k, 8k, and 16k Hz. + @param bandGain Gain of each band in dB. The value ranges from -15 to 15. The + default value is 0. + @return + - 0: Success. + - -1: Failure.
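The band-index to center-frequency mapping for `setLocalVoiceEqualization` can be captured in a small lookup table. A minimal sketch, assuming the standard ten octave bands (the array and function names are illustrative, not SDK API):

```cpp
// Center frequencies (Hz) for band indices 0-9 of the local voice
// equalizer, assuming the standard ten-octave-band layout.
// Illustrative helper only, not an SDK API.
constexpr int kEqBandCenterHz[10] = {
    31, 62, 125, 250, 500, 1000, 2000, 4000, 8000, 16000};

constexpr int eqBandCenterHz(int bandIndex) {
    return (bandIndex >= 0 && bandIndex < 10) ? kEqBandCenterHz[bandIndex] : -1;
}
```

Such a table is handy for labeling EQ sliders in a UI before passing the band index and a [-15, 15] dB gain to `setLocalVoiceEqualization`.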
+ */ + virtual int setLocalVoiceEqualization(AUDIO_EQUALIZATION_BAND_FREQUENCY bandFrequency, int bandGain) = 0; + + /** Sets the local voice reverberation. + + @param reverbKey The reverberation key: #AUDIO_REVERB_TYPE. + @param value The value of the reverberation key. + @return + - 0: Success. + - -1: Failure. + */ + virtual int setLocalVoiceReverb(AUDIO_REVERB_TYPE reverbKey, int value) = 0; + /** Sets a preset audio playback effect for remote headphones after remote audio is mixed. + + @param preset The preset key: #HEADPHONE_EQUALIZER_PRESET. + - HEADPHONE_EQUALIZER_OFF = 0x00000000 : Turn off the equalizer effect for headphones. + - HEADPHONE_EQUALIZER_OVEREAR = 0x04000001 : For over-ear headphones only. + - HEADPHONE_EQUALIZER_INEAR = 0x04000002 : For in-ear headphones only. + @return + - 0: Success. + - < 0: Failure. + - -1(ERR_FAILED): A general error occurs (no specified reason). + */ + virtual int setHeadphoneEQPreset(HEADPHONE_EQUALIZER_PRESET preset) = 0; + + /** Sets the parameters of the audio playback effect for remote headphones after remote audio is mixed. + + @param lowGain The higher the parameter value, the deeper the sound. The value range is [-10,10]. + @param highGain The higher the parameter value, the sharper the sound. The value range is [-10,10]. + @return + - 0: Success. + - < 0: Failure. + - -1(ERR_FAILED): A general error occurs (no specified reason). + */ + virtual int setHeadphoneEQParameters(int lowGain, int highGain) = 0; + + /** **DEPRECATED** Specifies an SDK output log file. + * + * The log file records all log data for the SDK's operation. Ensure that the + * directory for the log file exists and is writable. + * + * @note + * Ensure that you call this method immediately after \ref initialize "initialize", + * or the output log may not be complete. + * + * @param filePath File path of the log file. The string of the log file is in UTF-8. + * + * @return + * - 0: Success. + * - < 0: Failure.
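The headphone EQ gains for `setHeadphoneEQParameters` are constrained to [-10, 10]. A tiny clamp helper (illustrative, not an SDK API) keeps UI-driven values in range before the call:

```cpp
#include <algorithm>

// Clamps a UI-provided gain to the [-10, 10] range documented for
// setHeadphoneEQParameters. Illustrative helper, not an SDK API.
int clampHeadphoneEqGain(int gain) {
    return std::clamp(gain, -10, 10);
}
```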
+ */ + virtual int setLogFile(const char* filePath) = 0; + + /** + * Sets the output log filter level of the SDK. + * + * You can use one or a combination of the filters. The log filter level follows the + * sequence of `OFF`, `CRITICAL`, `ERROR`, `WARNING`, `INFO`, and `DEBUG`. Choose a filter level + * and you will see logs preceding that filter level. For example, if you set the log filter level to + * `WARNING`, you see the logs within levels `CRITICAL`, `ERROR`, and `WARNING`. + * + * @param filter The log filter level: + * - `LOG_FILTER_DEBUG(0x80f)`: Output all API logs. Set your log filter as DEBUG + * if you want to get the most complete log file. + * - `LOG_FILTER_INFO(0x0f)`: Output logs of the CRITICAL, ERROR, WARNING, and INFO + * level. We recommend setting your log filter as this level. + * - `LOG_FILTER_WARNING(0x0e)`: Output logs of the CRITICAL, ERROR, and WARNING level. + * - `LOG_FILTER_ERROR(0x0c)`: Output logs of the CRITICAL and ERROR level. + * - `LOG_FILTER_CRITICAL(0x08)`: Output logs of the CRITICAL level. + * - `LOG_FILTER_OFF(0)`: Do not output any log. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setLogFilter(unsigned int filter) = 0; + + /** + * Sets the output log level of the SDK. + * + * You can set the SDK to output the log files of the specified level. + * + * @param level The log level: + * - `LOG_LEVEL_NONE (0x0000)`: Do not output any log file. + * - `LOG_LEVEL_INFO (0x0001)`: (Recommended) Output log files of the INFO level. + * - `LOG_LEVEL_WARN (0x0002)`: Output log files of the WARN level. + * - `LOG_LEVEL_ERROR (0x0004)`: Output log files of the ERROR level. + * - `LOG_LEVEL_FATAL (0x0008)`: Output log files of the FATAL level. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setLogLevel(commons::LOG_LEVEL level) = 0; + + /** + * Sets the log file size (KB). + * + * The SDK has two log files, each with a default size of 512 KB.
If you set + * `fileSizeInKBytes` as 1024 KB, the SDK outputs log files with a total + * maximum size of 2 MB. + * If the total size of the log files exceeds the set value, + * the new output log files overwrite the old output log files. + * + * @param fileSizeInKBytes The SDK log file size (KB). + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setLogFileSize(unsigned int fileSizeInKBytes) = 0; + + /** Uploads the current log file to the server immediately. + * Use this method only when an error occurs. + * This method blocks until the log file upload succeeds or times out. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int uploadLogFile(agora::util::AString& requestId) = 0; + + /** + * Updates the display mode of the local video view. + * + * After initializing the local video view, you can call this method to update its rendering mode. + * It affects only the video view that the local user sees, not the published local video stream. + * + * @note + * - Ensure that you have called \ref setupLocalVideo "setupLocalVideo" to initialize the local video + * view before calling this method. + * - During a call, you can call this method as many times as necessary to update the local video view. + * + * @param renderMode Sets the local display mode. See #RENDER_MODE_TYPE. + * @param mirrorMode Sets the local mirror mode. See #VIDEO_MIRROR_MODE_TYPE. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setLocalRenderMode(media::base::RENDER_MODE_TYPE renderMode, VIDEO_MIRROR_MODE_TYPE mirrorMode) = 0; + + /** + * Updates the display mode of the video view of a remote user. + * + * After initializing the video view of a remote user, you can call this method to update its + * rendering and mirror modes. This method affects only the video view that the local user sees. + * + * @note + * - Ensure that you have called \ref setupRemoteVideo "setupRemoteVideo" to initialize the remote video + * view before calling this method.
+ * - During a call, you can call this method as many times as necessary to update the display mode + * of the video view of a remote user. + * + * @param uid ID of the remote user. + * @param renderMode Sets the remote display mode. See #RENDER_MODE_TYPE. + * @param mirrorMode Sets the mirror type. See #VIDEO_MIRROR_MODE_TYPE. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setRemoteRenderMode(uid_t uid, media::base::RENDER_MODE_TYPE renderMode, + VIDEO_MIRROR_MODE_TYPE mirrorMode) = 0; + + // The following APIs are deprecated and are going to be deleted. + + /** + * Updates the display mode of the local video view. + * + * After initializing the local video view, you can call this method to update its rendering mode. + * It affects only the video view that the local user sees, not the published local video stream. + * + * @note + * - Ensure that you have called \ref setupLocalVideo "setupLocalVideo" to initialize the local video + * view before calling this method. + * - During a call, you can call this method as many times as necessary to update the local video view. + * + * @param renderMode Sets the local display mode. See #RENDER_MODE_TYPE. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setLocalRenderMode(media::base::RENDER_MODE_TYPE renderMode) = 0; + + /** + * Sets the local video mirror mode. + * + * Use this method before calling the \ref startPreview "startPreview" method, or the mirror mode + * does not take effect until you call the `startPreview` method again. + * @param mirrorMode Sets the local video mirror mode. See #VIDEO_MIRROR_MODE_TYPE. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setLocalVideoMirrorMode(VIDEO_MIRROR_MODE_TYPE mirrorMode) = 0; + + /** + * Enables or disables the dual video stream mode.
+ * + * If dual-stream mode is enabled, the subscriber can choose to receive the high-stream + * (high-resolution high-bitrate video stream) or low-stream (low-resolution low-bitrate video stream) + * video using \ref setRemoteVideoStreamType "setRemoteVideoStreamType". + * + * @param enabled + * - true: Enable the dual-stream mode. + * - false: (default) Disable the dual-stream mode. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableDualStreamMode(bool enabled) __deprecated = 0; + + /** + * Enables or disables the dual video stream mode. + * + * If dual-stream mode is enabled, the subscriber can choose to receive the high-stream + * (high-resolution high-bitrate video stream) or low-stream (low-resolution low-bitrate video stream) + * video using \ref setRemoteVideoStreamType "setRemoteVideoStreamType". + * + * @param enabled + * - true: Enable the dual-stream mode. + * - false: (default) Disable the dual-stream mode. + * @param streamConfig + * - The configuration of the low-quality video stream. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableDualStreamMode(bool enabled, const SimulcastStreamConfig& streamConfig) __deprecated = 0; + + + /** + * Enables, disables, or automatically enables the dual video stream mode. + * + * If dual-stream mode is enabled, the subscriber can choose to receive the high-stream + * (high-resolution high-bitrate video stream) or low-stream (low-resolution low-bitrate video stream) + * video using \ref setRemoteVideoStreamType "setRemoteVideoStreamType". + * + * @param mode + * - The dual stream mode. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setDualStreamMode(SIMULCAST_STREAM_MODE mode) = 0; + + /** + * Enables, disables, or automatically enables the dual video stream mode.
+ * + * If dual-stream mode is enabled, the subscriber can choose to receive the high-stream + * (high-resolution high-bitrate video stream) or low-stream (low-resolution low-bitrate video stream) + * video using \ref setRemoteVideoStreamType "setRemoteVideoStreamType". + * + * @param mode Dual stream mode: #SIMULCAST_STREAM_MODE. + * @param streamConfig Configurations of the low stream: SimulcastStreamConfig. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setDualStreamMode(SIMULCAST_STREAM_MODE mode, const SimulcastStreamConfig& streamConfig) = 0; + + /** + * Enables or disables local playback of the external audio track. + * + * @note + * Ensure that you call this method before joining the channel. + * + * @param trackId The custom audio track ID. + * @param enabled Whether to play the external audio track locally: + * - true: Play the external audio track locally. + * - false: Do not play the external audio track locally. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableCustomAudioLocalPlayback(track_id_t trackId, bool enabled) = 0; + + /** + * Sets the audio recording format for the + * \ref agora::media::IAudioFrameObserver::onRecordAudioFrame "onRecordAudioFrame" callback. + * + * @param sampleRate The sample rate (Hz) of the audio data returned in the `onRecordAudioFrame` callback, which can be set + * as 8000, 16000, 32000, 44100, or 48000. + * @param channel The number of audio channels of the audio data returned in the `onRecordAudioFrame` callback, which can + * be set as 1 or 2: + * - 1: Mono. + * - 2: Stereo. + * @param mode This mode is deprecated. + * @param samplesPerCall Not supported. The number of samples returned per call of + * onRecordAudioFrame(). For example, it is usually set as 1024 for stream + * pushing. + * @return + * - 0: Success. + * - < 0: Failure.
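For the raw audio-frame callbacks configured above, a common chore is sizing the PCM buffer for a given format. A minimal sketch assuming 16-bit PCM (2 bytes per sample); the helper name is illustrative, not SDK API:

```cpp
// Bytes needed for one raw audio frame of 16-bit PCM, given the
// sample rate (Hz), channel count (1 or 2), and frame length (ms).
// Illustrative helper, not an SDK API.
constexpr int pcm16FrameBytes(int sampleRateHz, int channels, int frameMs) {
    return sampleRateHz / 1000 * frameMs * channels * 2;
}
```

For example, a 10 ms stereo frame at 48 kHz needs 48 * 10 * 2 * 2 = 1920 bytes.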
+ */ + virtual int setRecordingAudioFrameParameters(int sampleRate, int channel, + RAW_AUDIO_FRAME_OP_MODE_TYPE mode, + int samplesPerCall) = 0; + + /** + * Sets the audio playback format for the + * \ref agora::media::IAudioFrameObserver::onPlaybackAudioFrame "onPlaybackAudioFrame" callback. + * + * @param sampleRate Sets the sample rate (Hz) of the audio data returned in the `onPlaybackAudioFrame` callback, + * which can be set as 8000, 16000, 32000, 44100, or 48000. + * @param channel The number of channels of the audio data returned in the `onPlaybackAudioFrame` callback, which + * can be set as 1 or 2: + * - 1: Mono + * - 2: Stereo + * @param mode Deprecated. The use mode of the onPlaybackAudioFrame() callback: + * agora::rtc::RAW_AUDIO_FRAME_OP_MODE_TYPE. + * @param samplesPerCall Not supported. The number of samples returned per call of + * onPlaybackAudioFrame(). For example, it is usually set as 1024 for stream + * pushing. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setPlaybackAudioFrameParameters(int sampleRate, int channel, + RAW_AUDIO_FRAME_OP_MODE_TYPE mode, + int samplesPerCall) = 0; + + /** + * Sets the mixed audio format for the + * \ref agora::media::IAudioFrameObserver::onMixedAudioFrame "onMixedAudioFrame" callback. + * + * @param sampleRate The sample rate (Hz) of the audio data returned in the `onMixedAudioFrame` callback, which can + * be set as 8000, 16000, 32000, 44100, or 48000. + * @param channel The number of channels of the audio data in `onMixedAudioFrame` callback, which can be set as 1 or 2: + * - 1: Mono + * - 2: Stereo + * @param samplesPerCall Not supported. The number of samples returned per call of + * `onMixedAudioFrame`. For example, it is usually set as 1024 for stream pushing. + * @return + * - 0: Success. + * - < 0: Failure.
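The `samplesPerCall` value mentioned above corresponds to a callback interval that depends on the sample rate. A sketch of the arithmetic, truncated to integer milliseconds (the helper name is illustrative, not SDK API):

```cpp
// Approximate duration (ms, truncated) covered by `samplesPerCall`
// samples at a given sample rate. Illustrative helper, not an SDK API.
constexpr int samplesToMs(int samplesPerCall, int sampleRateHz) {
    return samplesPerCall * 1000 / sampleRateHz;
}
```

For instance, 1024 samples at 48000 Hz cover roughly 21 ms of audio per callback.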
+ */ + virtual int setMixedAudioFrameParameters(int sampleRate, int channel, int samplesPerCall) = 0; + + /** + * Sets the audio ear monitoring format for the + * \ref agora::media::IAudioFrameObserver::onEarMonitoringAudioFrame "onEarMonitoringAudioFrame" callback. + * + * @param sampleRate Sets the sample rate (Hz) of the audio data returned in the `onEarMonitoringAudioFrame` callback, + * which can be set as 8000, 16000, 32000, 44100, or 48000. + * @param channel The number of channels of the audio data returned in the `onEarMonitoringAudioFrame` callback, which + * can be set as 1 or 2: + * - 1: Mono + * - 2: Stereo + * @param mode Deprecated. The use mode of the onEarMonitoringAudioFrame() callback: + * agora::rtc::RAW_AUDIO_FRAME_OP_MODE_TYPE. + * @param samplesPerCall Not supported. The number of samples returned per call of + * onEarMonitoringAudioFrame(). For example, it is usually set as 1024 for stream + * pushing. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setEarMonitoringAudioFrameParameters(int sampleRate, int channel, + RAW_AUDIO_FRAME_OP_MODE_TYPE mode, + int samplesPerCall) = 0; + + /** + * Sets the audio playback format before mixing in the + * \ref agora::media::IAudioFrameObserver::onPlaybackAudioFrameBeforeMixing "onPlaybackAudioFrameBeforeMixing" + * callback. + * + * @param sampleRate The sample rate (Hz) of the audio data returned in + * `onPlaybackAudioFrameBeforeMixing`, which can be set as 8000, 16000, 32000, 44100, or 48000. + * @param channel Number of channels of the audio data returned in `onPlaybackAudioFrameBeforeMixing`, + * which can be set as 1 or 2: + * - 1: Mono + * - 2: Stereo + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setPlaybackAudioFrameBeforeMixingParameters(int sampleRate, int channel) = 0; + + /** + * Enables the audio spectrum monitor. + * + * @param intervalInMS Sets the time interval (ms) between two consecutive audio spectrum callbacks. + * The default value is 100.
This parameter must be greater than 10. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableAudioSpectrumMonitor(int intervalInMS = 100) = 0; + /** + * Disables the audio spectrum monitor. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int disableAudioSpectrumMonitor() = 0; + + /** + * Registers an audio spectrum observer. + * + * You need to implement the `IAudioSpectrumObserver` class in this method, and register the following callbacks + * according to your scenario: + * - \ref agora::media::IAudioSpectrumObserver::onAudioSpectrumComputed "onAudioSpectrumComputed": Occurs when + * the SDK receives the audio data, at set intervals. + * + * @param observer A pointer to the audio spectrum observer: \ref agora::media::IAudioSpectrumObserver + * "IAudioSpectrumObserver". + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int registerAudioSpectrumObserver(agora::media::IAudioSpectrumObserver * observer) = 0; + /** + * Unregisters the audio spectrum observer. + * + * @param observer The pointer to the audio spectrum observer: \ref agora::media::IAudioSpectrumObserver + * "IAudioSpectrumObserver". + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int unregisterAudioSpectrumObserver(agora::media::IAudioSpectrumObserver * observer) = 0; + + /** Adjusts the recording volume. + + @param volume The recording volume, which ranges from 0 to 400: + + - 0: Mute the recording volume. + - 100: The original volume. + - 400: (Maximum) Four times the original volume with signal clipping + protection. + + @return + - 0: Success. + - < 0: Failure. + */ + virtual int adjustRecordingSignalVolume(int volume) = 0; + + /** + * Mutes or resumes the recording signal volume. + * + * @param mute Determines whether to mute or resume the recording signal volume. + * - true: Mute the recording signal volume. + * - false: (Default) Resume the recording signal volume. + * + * @return + * - 0: Success. + * - < 0: Failure.
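The 0-400 volume scale used by `adjustRecordingSignalVolume` and the playback variants maps linearly to a gain multiplier, with 100 as unity. A minimal sketch of the conversion (the helper name is illustrative, not SDK API):

```cpp
// Converts the documented 0-400 volume scale to a linear gain
// multiplier (100 -> 1.0x, 400 -> 4.0x). Illustrative helper only.
constexpr double volumeToGain(int volume) {
    return volume / 100.0;
}
```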
+ */ + virtual int muteRecordingSignal(bool mute) = 0; + + /** Adjusts the playback volume. + + @param volume The playback volume, which ranges from 0 to 400: + + - 0: Mute the playback volume. + - 100: The original volume. + - 400: (Maximum) Four times the original volume with signal clipping + protection. + + @return + - 0: Success. + - < 0: Failure. + */ + virtual int adjustPlaybackSignalVolume(int volume) = 0; + + /** + * Adjusts the playback volume of the user specified by uid. + * + * You can call this method to adjust the playback volume of the user specified by uid + * during a call. To adjust the playback volume of multiple users, call this + * method multiple times. + * + * @note + * Call this method after joining a channel. + * This method adjusts the playback volume of the specified user. + * + * @param uid Remote user ID. + * @param volume The playback volume of the specified remote user. The value ranges between 0 and 400, including the following: + * 0: Mute. + * 100: (Default) Original volume. + * 400: Four times the original volume with signal-clipping protection. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int adjustUserPlaybackSignalVolume(uid_t uid, int volume) = 0; + + /** Sets the fallback option for the published video stream based on the network conditions. + + If `option` is set as #STREAM_FALLBACK_OPTION_AUDIO_ONLY (2), the SDK will: + + - Disable the upstream video but enable audio only when the network conditions deteriorate and cannot support both video and audio. + - Re-enable the video when the network conditions improve. + + When the published video stream falls back to audio only or when the audio-only stream switches back to the video, the SDK triggers the \ref agora::rtc::IRtcEngineEventHandler::onLocalPublishFallbackToAudioOnly "onLocalPublishFallbackToAudioOnly" callback.
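The recording, playback, and per-user volume setters all share the same 0–400 scale. A hypothetical helper that converts that scale to a linear gain, purely to illustrate the documented semantics (the SDK applies its own internal scaling):

```cpp
// 0 mutes, 100 is the original volume, 400 is four times the original
// volume with clipping protection. Hypothetical illustration of the
// documented 0-400 volume scale; not SDK code.
double volumeToGain(int volume) {
    if (volume < 0) volume = 0;       // clamp below the documented range
    if (volume > 400) volume = 400;   // clamp above the documented range
    return volume / 100.0;            // 100 -> 1.0 (original), 400 -> 4.0
}
```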
+ + @note + - Agora does not recommend using this method for CDN live streaming, because the remote CDN live user will have a noticeable lag when the published video stream falls back to audio only. + - Ensure that you call this method before joining a channel. + + @param option Sets the fallback option for the published video stream: + - #STREAM_FALLBACK_OPTION_DISABLED (0): (Default) No fallback behavior for the published video stream when the uplink network condition is poor. The stream quality is not guaranteed. + - #STREAM_FALLBACK_OPTION_AUDIO_ONLY (2): The published video stream falls back to audio only when the uplink network condition is poor. + + @return + - 0: Success. + - < 0: Failure. + */ + virtual int setLocalPublishFallbackOption(STREAM_FALLBACK_OPTIONS option) = 0; + + /** Sets the fallback option for the remotely subscribed video stream based on the network conditions. + + The default setting for `option` is #STREAM_FALLBACK_OPTION_VIDEO_STREAM_LOW (1), where the remotely subscribed video stream falls back to the low-stream video (low resolution and low bitrate) under poor downlink network conditions. + + If `option` is set as #STREAM_FALLBACK_OPTION_AUDIO_ONLY (2), the SDK automatically switches the video from a high-stream to a low-stream, or disables the video when the downlink network conditions cannot support both audio and video to guarantee the quality of the audio. The SDK monitors the network quality and restores the video stream when the network conditions improve. + + When the remotely subscribed video stream falls back to audio only or when the audio-only stream switches back to the video stream, the SDK triggers the \ref agora::rtc::IRtcEngineEventHandler::onRemoteSubscribeFallbackToAudioOnly "onRemoteSubscribeFallbackToAudioOnly" callback. + + @note Ensure that you call this method before joining a channel. + + @param option Sets the fallback option for the remotely subscribed video stream. See #STREAM_FALLBACK_OPTIONS. 
+ @return + - 0: Success. + - < 0: Failure. + */ + virtual int setRemoteSubscribeFallbackOption(STREAM_FALLBACK_OPTIONS option) = 0; + + /** Sets the high priority user list and their fallback level in weak network conditions. + * @note + * - This method can be called before and after joining a channel. + * - If a subscriber is set to high priority, this stream only falls back to a lower stream after all normal-priority users fall back to their fallback level in weak network conditions, if needed. + * + * @param uidList The high priority user list. + * @param uidNum The size of uidList. + * @param option The fallback level of high priority users. + * + * @return int + * - 0 : Success. + * - <0 : Failure. + */ + virtual int setHighPriorityUserList(uid_t* uidList, int uidNum, STREAM_FALLBACK_OPTIONS option) = 0; + + /** + * Enables/Disables an extension. + * By calling this function, you can dynamically enable/disable the extension without changing the pipeline. + * For example, enabling/disabling Extension_A means the data will be adapted/bypassed by Extension_A. + * + * NOTE: For compatibility reasons, if you haven't called registerExtension, + * enableExtension will automatically register the specified extension. + * We suggest you call registerExtension explicitly. + * + * @param provider The name of the extension provider, e.g. agora.io. + * @param extension The name of the extension, e.g. agora.beauty. + * @param extensionInfo The information of the extension. + * @param enable Whether to enable the extension: + * - true: (Default) Enable the extension. + * - false: Disable the extension. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableExtension(const char* provider, const char* extension, const ExtensionInfo& extensionInfo, bool enable = true) = 0; + + /** + * Sets the properties of an extension. + * + * @param provider The name of the extension provider, e.g. agora.io. + * @param extension The name of the extension, e.g. agora.beauty.
+ * @param extensionInfo The information for extension. + * @param key The key of the extension. + * @param value The JSON formatted value of the extension key. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setExtensionProperty(const char* provider, const char* extension, const ExtensionInfo& extensionInfo, const char* key, const char* value) = 0; + + /** + * Gets the properties of an extension. + * + * @param provider The name of the extension provider, e.g. agora.io. + * @param extension The name of the extension, e.g. agora.beauty. + * @param extensionInfo The information for extension. + * @param key The key of the extension. + * @param value The value of the extension key. + * @param buf_len Maximum length of the JSON string indicating the extension property. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getExtensionProperty(const char* provider, const char* extension, const ExtensionInfo& extensionInfo, const char* key, char* value, int buf_len) = 0; + + /** Enables loopback recording. + * + * If you enable loopback recording, the output of the default sound card is mixed into + * the audio stream sent to the other end. + * + * @note This method is for Windows only. + * + * @param enabled Sets whether to enable/disable loopback recording. + * - true: Enable loopback recording. + * - false: (Default) Disable loopback recording. + * @param deviceName Pointer to the device name of the sound card. The default value is NULL (the default sound card). + * - This method is for macOS and Windows only. + * - macOS does not support loopback capturing of the default sound card. If you need to use this method, + * please use a virtual sound card and pass its name to the deviceName parameter. Agora has tested and recommends using soundflower. + * @return + * - 0: Success. + * - < 0: Failure. 
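The set/getExtensionProperty pair stores JSON-formatted string values per key, with reads bounded by `buf_len`. A standalone sketch of that contract — an illustration only, with hypothetical helper names, not the SDK's implementation:

```cpp
#include <cstring>
#include <map>
#include <string>

// Illustrative model of the set/getExtensionProperty contract:
// JSON string values stored per key, bounded reads. Not SDK code.
static std::map<std::string, std::string> g_props;

int setProperty(const char* key, const char* value) {
    if (!key || !value) return -2;  // mirrors ERR_INVALID_ARGUMENT (2)
    g_props[key] = value;
    return 0;
}

int getProperty(const char* key, char* value, int buf_len) {
    if (!key || !value || buf_len <= 0) return -2;
    auto it = g_props.find(key);
    if (it == g_props.end()) return -1;  // unknown key
    // Copy at most buf_len - 1 characters and NUL-terminate,
    // modeling a bounded read of the JSON string.
    std::strncpy(value, it->second.c_str(), buf_len - 1);
    value[buf_len - 1] = '\0';
    return 0;
}
```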
+ */ + virtual int enableLoopbackRecording(bool enabled, const char* deviceName = NULL) = 0; + + + /** Adjusts the loopback recording volume. + + @param volume The loopback volume, which ranges from 0 to 100: + + - 0: Mute the loopback volume. + - 100: The original volume. + + @return + - 0: Success. + - < 0: Failure. + */ + virtual int adjustLoopbackSignalVolume(int volume) = 0; + + /** Retrieves the audio volume for recording loopback. + @note Call this method when you are in a channel. + @return + - ≥ 0: The audio volume for loopback, if this method call succeeds. The value range is [0,100]. + - < 0: Failure. + */ + virtual int getLoopbackRecordingVolume() = 0; + + /** + * Enables in-ear monitoring. + * + * @param enabled Determines whether to enable in-ear monitoring. + * - true: Enable. + * - false: (Default) Disable. + * @param includeAudioFilters The type of the ear monitoring: #EAR_MONITORING_FILTER_TYPE + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableInEarMonitoring(bool enabled, int includeAudioFilters) = 0; + + /** + * Sets the volume of the in-ear monitor. + * + * @param volume Sets the volume of the in-ear monitor. The value ranges + * between 0 and 100 (default). + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setInEarMonitoringVolume(int volume) = 0; + +#if defined (_WIN32) || defined(__linux__) || defined(__ANDROID__) + virtual int loadExtensionProvider(const char* path, bool unload_after_use = false) = 0; +#endif + + /** + * Sets the provider property of an extension. + * + * @param provider The name of the extension provider, e.g. agora.io. + * @param key The key of the extension. + * @param value The JSON formatted value of the extension key. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setExtensionProviderProperty(const char* provider, const char* key, const char* value) = 0; + + /** + * Registers an extension.
Normally you should call this function immediately after engine initialization. + * Once an extension is registered, the SDK will automatically create and add it to the pipeline. + * + * @param provider The name of the extension provider, e.g. agora.io. + * @param extension The name of the extension, e.g. agora.beauty. + * @param type The source type of the extension, e.g. PRIMARY_CAMERA_SOURCE. The default is UNKNOWN_MEDIA_SOURCE. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int registerExtension(const char* provider, const char* extension, agora::media::MEDIA_SOURCE_TYPE type = agora::media::UNKNOWN_MEDIA_SOURCE) = 0; + + /** + * Enables/Disables an extension. + * By calling this function, you can dynamically enable/disable the extension without changing the pipeline. + * For example, enabling/disabling Extension_A means the data will be adapted/bypassed by Extension_A. + * + * NOTE: For compatibility reasons, if you haven't called registerExtension, + * enableExtension will automatically register the specified extension. + * We suggest you call registerExtension explicitly. + * + * @param provider The name of the extension provider, e.g. agora.io. + * @param extension The name of the extension, e.g. agora.beauty. + * @param enable Whether to enable the extension: + * - true: (Default) Enable the extension. + * - false: Disable the extension. + * @param type The source type of the extension, e.g. PRIMARY_CAMERA_SOURCE. The default is UNKNOWN_MEDIA_SOURCE. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableExtension(const char* provider, const char* extension, bool enable=true, agora::media::MEDIA_SOURCE_TYPE type = agora::media::UNKNOWN_MEDIA_SOURCE) = 0; + + /** + * Sets the properties of an extension. + * + * @param provider The name of the extension provider, e.g. agora.io. + * @param extension The name of the extension, e.g. agora.beauty. + * @param key The key of the extension.
+ * @param value The JSON formatted value of the extension key. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setExtensionProperty( + const char* provider, const char* extension, + const char* key, const char* value, agora::media::MEDIA_SOURCE_TYPE type = agora::media::UNKNOWN_MEDIA_SOURCE) = 0; + + /** + * Gets the properties of an extension. + * + * @param provider The name of the extension provider, e.g. agora.io. + * @param extension The name of the extension, e.g. agora.beauty. + * @param key The key of the extension. + * @param value The value of the extension key. + * @param buf_len Maximum length of the JSON string indicating the extension property. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getExtensionProperty( + const char* provider, const char* extension, + const char* key, char* value, int buf_len, agora::media::MEDIA_SOURCE_TYPE type = agora::media::UNKNOWN_MEDIA_SOURCE) = 0; + + /** Sets the camera capture configuration. + * @note Call this method before enabling the local camera. + * That said, you can call this method before calling \ref IRtcEngine::joinChannel "joinChannel", + * \ref IRtcEngine::enableVideo "enableVideo", or \ref IRtcEngine::enableLocalVideo "enableLocalVideo", + * depending on which method you use to turn on your local camera. + * + * @param config Sets the camera capturer configuration. See CameraCapturerConfiguration. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setCameraCapturerConfiguration(const CameraCapturerConfiguration& config) = 0; + + /** + * Gets a custom video track id created internally, which can be used to publish or preview. + * + * @return + * - > 0: The usable video track id. + * - < 0: Failure. + */ + virtual video_track_id_t createCustomVideoTrack() = 0; + + /** + * Gets a custom encoded video track id created internally, which can be used to publish or preview. + * + * @return + * - > 0: The usable video track id.
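The create/destroy pairs above follow a simple lifecycle: create returns a positive track id, and destroy succeeds only for a previously created id. A hypothetical standalone model of that contract (not SDK code, names assumed):

```cpp
#include <set>

// Illustrative model of the createCustomVideoTrack/destroyCustomVideoTrack
// contract. The real SDK allocates ids internally; this just mirrors the
// documented return values.
using video_track_id_t = unsigned int;

static std::set<video_track_id_t> g_tracks;
static video_track_id_t g_next_id = 1;

video_track_id_t createTrack() {
    video_track_id_t id = g_next_id++;
    g_tracks.insert(id);
    return id;  // > 0: usable track id
}

int destroyTrack(video_track_id_t id) {
    // 0 on success; negative for an id that was never created
    // (or was already destroyed).
    return g_tracks.erase(id) ? 0 : -1;
}
```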
+ * - < 0: Failure. + */ + virtual video_track_id_t createCustomEncodedVideoTrack(const SenderOptions& sender_option) = 0; + + /** + * Destroys a custom video track. + * + * @param video_track_id The video track id which was created by createCustomVideoTrack + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int destroyCustomVideoTrack(video_track_id_t video_track_id) = 0; + + /** + * Destroys a custom encoded video track. + * + * @param video_track_id The video track id which was created by createCustomEncodedVideoTrack + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int destroyCustomEncodedVideoTrack(video_track_id_t video_track_id) = 0; + +#if defined(__ANDROID__) || (defined(__APPLE__) && TARGET_OS_IOS) + /** + * Switches between front and rear cameras. + * + * @note This method applies to Android and iOS only. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int switchCamera() = 0; + + /** + * Checks whether the camera zoom function is supported. + * + * @return + * - true: The camera zoom function is supported. + * - false: The camera zoom function is not supported. + */ + virtual bool isCameraZoomSupported() = 0; + + /** + * Checks whether camera face detection is supported. + * + * @return + * - true: Camera face detection is supported. + * - false: Camera face detection is not supported. + */ + virtual bool isCameraFaceDetectSupported() = 0; + + /** + * Checks whether the camera flash function is supported. + * + * @return + * - true: The camera flash function is supported. + * - false: The camera flash function is not supported. + */ + virtual bool isCameraTorchSupported() = 0; + + /** + * Checks whether the camera manual focus function is supported. + * + * @return + * - true: The camera manual focus function is supported. + * - false: The camera manual focus function is not supported.
+ */ + virtual bool isCameraFocusSupported() = 0; + + /** + * Checks whether the camera auto focus function is supported. + * + * @return + * - true: The camera auto focus function is supported. + * - false: The camera auto focus function is not supported. + */ + virtual bool isCameraAutoFocusFaceModeSupported() = 0; + + /** + * Sets the camera zoom ratio. + * + * @param factor The camera zoom factor. It ranges from 1.0 to the maximum zoom + * supported by the camera. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setCameraZoomFactor(float factor) = 0; + + /** + * Sets the camera face detection. + * + * @param enabled Whether to enable camera face detection. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableFaceDetection(bool enabled) = 0; + + /** + * Gets the maximum zoom ratio supported by the camera. + * @return The maximum zoom ratio supported by the camera. + */ + virtual float getCameraMaxZoomFactor() = 0; + + /** + * Sets the manual focus position. + * + * @param positionX The horizontal coordinate of the touch point in the view. + * @param positionY The vertical coordinate of the touch point in the view. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setCameraFocusPositionInPreview(float positionX, float positionY) = 0; + + /** + * Enables the camera flash. + * + * @param isOn Determines whether to enable the camera flash. + * - true: Enable the flash. + * - false: Do not enable the flash. + */ + virtual int setCameraTorchOn(bool isOn) = 0; + + /** + * Enables the camera auto focus face function. + * + * @param enabled Determines whether to enable the camera auto focus face mode. + * - true: Enable the auto focus face function. + * - false: Do not enable the auto focus face function. + */ + virtual int setCameraAutoFocusFaceModeEnabled(bool enabled) = 0; + + /** Checks whether the camera exposure function is supported.
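setCameraZoomFactor accepts values from 1.0 up to getCameraMaxZoomFactor(). A hypothetical pre-clamp an app might apply before calling the SDK (illustration only):

```cpp
// Clamp a requested zoom to the documented valid range
// [1.0, maxZoom], where maxZoom comes from getCameraMaxZoomFactor().
// Hypothetical helper, not SDK code.
float clampZoom(float requested, float maxZoom) {
    if (requested < 1.0f) return 1.0f;        // below the minimum zoom
    if (requested > maxZoom) return maxZoom;  // cap at the device maximum
    return requested;
}
```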
+ * + * Ensure that you call this method after the camera starts, for example, by calling `startPreview` or `joinChannel`. + * + * @since v2.3.2. + * @return + * - true: The device supports the camera exposure function. + * - false: The device does not support the camera exposure function. + */ + virtual bool isCameraExposurePositionSupported() = 0; + + /** Sets the camera exposure position. + * + * Ensure that you call this method after the camera starts, for example, by calling `startPreview` or `joinChannel`. + * + * A successful setCameraExposurePosition method call triggers the {@link IRtcEngineEventHandler#onCameraExposureAreaChanged onCameraExposureAreaChanged} callback on the local client. + * @since v2.3.2. + * @param positionXinView The horizontal coordinate of the touch point in the view. + * @param positionYinView The vertical coordinate of the touch point in the view. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setCameraExposurePosition(float positionXinView, float positionYinView) = 0; + + /** + * Returns whether exposure value adjusting is supported by the current device. + * Exposure compensation is in auto exposure mode. + * @since v4.2.2 + * @note + * This method applies to Android and iOS only. + * This interface returns valid values only after the device is initialized. + * + * @return + * - true: Exposure value adjusting is supported. + * - false: Exposure value adjusting is not supported, or the device is not initialized. + */ + virtual bool isCameraExposureSupported() = 0; + + /** + * Sets the camera exposure ratio. + * @since v4.2.2 + * @param factor The camera exposure factor. The recommended range is -8.0 to 8.0 for iOS, + * and -20.0 to 20.0 for Android. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setCameraExposureFactor(float factor) = 0; + +#if defined(__APPLE__) + /** + * Checks whether the camera auto exposure function is supported. + * + * @return + * - true: The camera auto exposure function is supported. + * - false: The camera auto exposure function is not supported. + */ + virtual bool isCameraAutoExposureFaceModeSupported() = 0; + + /** + * Enables the camera auto exposure face function. + * + * @param enabled Determines whether to enable the camera auto exposure face mode. + * - true: Enable the auto exposure face function. + * - false: Do not enable the auto exposure face function. + */ + virtual int setCameraAutoExposureFaceModeEnabled(bool enabled) = 0; +#endif + + /** Sets the default audio route (for Android and iOS only). + + Most mobile phones have two audio routes: an earpiece at the top, and a + speakerphone at the bottom. The earpiece plays at a lower volume, and the + speakerphone at a higher volume. + + When setting the default audio route, you determine whether audio playback + comes through the earpiece or speakerphone when no external audio device is + connected.
+ + Depending on the scenario, Agora uses different default audio routes: + - Voice call: Earpiece + - Audio broadcast: Speakerphone + - Video call: Speakerphone + - Video broadcast: Speakerphone + + Call this method before, during, or after a call, to change the default + audio route. When the audio route changes, the SDK triggers the + \ref IRtcEngineEventHandler::onAudioRoutingChanged "onAudioRoutingChanged" + callback. + + @note The system audio route changes when an external audio device, such as + a headphone or a Bluetooth audio device, is connected. See *Principles for changing the audio route*. + + @param defaultToSpeaker Whether to set the speakerphone as the default audio + route: + - true: Set the speakerphone as the default audio route. + - false: Do not set the speakerphone as the default audio route. + + @return + - 0: Success. + - < 0: Failure. + */ + virtual int setDefaultAudioRouteToSpeakerphone(bool defaultToSpeaker) = 0; + + /** Enables/Disables the speakerphone temporarily (for Android and iOS only). + + When the audio route changes, the SDK triggers the + \ref IRtcEngineEventHandler::onAudioRoutingChanged "onAudioRoutingChanged" + callback. + + You can call this method before, during, or after a call. However, Agora + recommends calling this method only when you are in a channel to change + the audio route temporarily. + + @note This method sets the audio route temporarily. Plugging in or + unplugging a headphone, or the SDK re-enabling the audio device module + (ADM) to adjust the media volume in some scenarios relating to audio, leads + to a change in the audio route. See *Principles for changing the audio + route*. + + @param speakerOn Whether to set the speakerphone as the temporary audio + route: + - true: Set the speakerphone as the audio route temporarily. (For iOS only: + calling setEnableSpeakerphone(true) does not change the audio route to the + speakerphone if a headphone or a Bluetooth audio device is connected.) 
+ - false: Do not set the speakerphone as the audio route. + @return + - 0: Success. + - < 0: Failure. + */ + virtual int setEnableSpeakerphone(bool speakerOn) = 0; + + /** Checks whether the speakerphone is enabled (for Android and iOS only). + + @return + - true: The speakerphone is enabled, and the audio plays from the speakerphone. + - false: The speakerphone is not enabled, and the audio plays from devices + other than the speakerphone. For example, the headset or earpiece. + */ + virtual bool isSpeakerphoneEnabled() = 0; + + /** Selects the preferred audio route for Android communication mode. + + @param route The preferred route. For example, when a Bluetooth headset is connected, + you can use this API to switch the route to a wired headset. + @return Meaningless; the result of the route switch is reported through the onAudioRoutingChanged callback. + */ + virtual int setRouteInCommunicationMode(int route) = 0; + +#endif // __ANDROID__ || (__APPLE__ && TARGET_OS_IOS) + +#if defined(_WIN32) || (defined(__APPLE__) && TARGET_OS_MAC && !TARGET_OS_IPHONE) + /** Gets the \ref ScreenCaptureSourceInfo list including available windows and screens. + * + * @param thumbSize The expected size of the thumbnail; the image is scaled accordingly. For Windows, SIZE is defined in windef.h. + * @param iconSize The expected size of the icon; the image is scaled accordingly. For Windows, SIZE is defined in windef.h. + * @param includeScreen Determines whether to include screen information. + * - true: The sources include screen information. + * - false: The sources include window information only. + * @return + * - IScreenCaptureSourceList*: A pointer to an instance of IScreenCaptureSourceList. + */ + virtual IScreenCaptureSourceList* getScreenCaptureSources(const SIZE& thumbSize, const SIZE& iconSize, const bool includeScreen) = 0; +#endif // _WIN32 || (__APPLE__ && !TARGET_OS_IPHONE && TARGET_OS_MAC) +#if (defined(__APPLE__) && TARGET_OS_IOS) + /** Sets the operational permission of the SDK on the audio session.
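The default-route table documented for setDefaultAudioRouteToSpeakerphone above can be sketched as a simple scenario mapping: only voice calls default to the earpiece; audio broadcast, video call, and video broadcast default to the speakerphone. Hypothetical helper with assumed string labels, not SDK code:

```cpp
#include <string>

// Maps a scenario to its documented default audio route. The scenario
// labels here are illustrative; the SDK expresses scenarios via its
// channel profile and media options instead.
std::string defaultRoute(const std::string& scenario) {
    if (scenario == "voice_call") return "earpiece";
    // audio broadcast, video call, video broadcast
    return "speakerphone";
}
```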
+ * + * The SDK and the app can both configure the audio session by default. If + * you need to only use the app to configure the audio session, this method + * restricts the operational permission of the SDK on the audio session. + * + * You can call this method either before or after joining a channel. Once + * you call this method to restrict the operational permission of the SDK + * on the audio session, the restriction takes effect when the SDK needs to + * change the audio session. + * + * @note + * - This method is for iOS only. + * - This method does not restrict the operational permission of the app on + * the audio session. + * + * @param restriction The operational permission of the SDK on the audio session. + * See #AUDIO_SESSION_OPERATION_RESTRICTION. This parameter is in bit mask + * format, and each bit corresponds to a permission. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setAudioSessionOperationRestriction(AUDIO_SESSION_OPERATION_RESTRICTION restriction) = 0; +#endif // __APPLE__ && TARGET_OS_IOS + +#if defined(_WIN32) || (defined(__APPLE__) && !TARGET_OS_IPHONE && TARGET_OS_MAC) + + /** Shares the whole or part of a screen by specifying the display ID. + + @note This method applies to macOS only. + + @param displayId The display ID of the screen to be shared. This parameter + specifies which screen you want to share. For information on how to get the + displayId, see the advanced guide: Share the Screen. + @param regionRect (Optional) Sets the relative location of the region to the + screen. NIL means sharing the whole screen. See Rectangle. + If the specified region overruns the screen, the SDK shares only the region + within it; if you set width or height as 0, the SDK shares the whole screen. + @param captureParams Sets the screen sharing encoding parameters. See + ScreenCaptureParameters. + + @return + - 0: Success. + - < 0: Failure: + - ERR_INVALID_ARGUMENT (2): The argument is invalid. 
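The restriction parameter above is a bit mask where each bit corresponds to one permission withheld from the SDK. A standalone sketch of how such a mask is combined and tested; the enumerator names and values below are illustrative, not the SDK's actual AUDIO_SESSION_OPERATION_RESTRICTION values:

```cpp
// Illustrative bit-mask restrictions; each bit withholds one audio
// session operation from the SDK. Values are assumptions for the sketch.
enum Restriction : unsigned {
    kRestrictNone        = 0,
    kRestrictSetCategory = 1 << 0,
    kRestrictConfigure   = 1 << 1,
    kRestrictDeactivate  = 1 << 2,
};

// Check whether a particular operation is restricted in a combined mask.
bool isRestricted(unsigned mask, Restriction r) {
    return (mask & r) != 0;
}
```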
+ - ERR_NOT_INITIALIZED (7): You have not initialized IRtcEngine when trying to start screen capture. + */ + virtual int startScreenCaptureByDisplayId(uint32_t displayId, const Rectangle& regionRect, + const ScreenCaptureParameters& captureParams) = 0; + +#endif // __APPLE__ && TARGET_OS_MAC && !TARGET_OS_IPHONE + +#if defined(_WIN32) + /** + * Shares the whole or part of a screen by specifying the screen rect. + * + * @deprecated This method is deprecated, use \ref IRtcEngine::startScreenCaptureByDisplayId "startScreenCaptureByDisplayId" instead. Agora strongly recommends using `startScreenCaptureByDisplayId` if you need to start screen sharing on a device connected to another display. + * + * @note This method applies to Windows only. + * + * @param screenRect Sets the relative location of the screen to the virtual + * screen. For information on how to get screenRect, see the advanced guide: + * Share the Screen. + * @param regionRect (Optional) Sets the relative location of the region to the + * screen. NULL means sharing the whole screen. See Rectangle. + * If the specified region overruns the screen, the SDK shares only the region + * within it; if you set width or height as 0, the SDK shares the whole screen. + * @param captureParams Sets the screen sharing encoding parameters. See + * ScreenCaptureParameters. + * + * @return + * - 0: Success. + * - < 0: Failure: + * - ERR_INVALID_ARGUMENT (2): The argument is invalid. + * - ERR_NOT_INITIALIZED (7): You have not initialized IRtcEngine when trying to start screen capture. + */ + virtual int startScreenCaptureByScreenRect(const Rectangle& screenRect, + const Rectangle& regionRect, + const ScreenCaptureParameters& captureParams) __deprecated = 0; +#endif + +#if defined(__ANDROID__) + /** + * Gets the audio device info. + * @return + * - 0: Success. + * - < 0: Failure.
+ */ + virtual int getAudioDeviceInfo(DeviceInfo& deviceInfo) = 0; + +#endif // __ANDROID__ + +#if defined(_WIN32) || (defined(__APPLE__) && TARGET_OS_MAC && !TARGET_OS_IPHONE) + + /** Shares the whole or part of a window by specifying the window ID. + * + * @param windowId The ID of the window to be shared. For information on how to + * get the windowId, see the advanced guide *Share Screen*. + * @param regionRect (Optional) The relative location of the region to the + * window. NULL means sharing the whole window. See Rectangle. If the + * specified region overruns the window, the SDK shares only the region within + * it; if you set width or height as 0, the SDK shares the whole window. + * @param captureParams The window sharing encoding parameters. See + * ScreenCaptureParameters. + * + * @return + * - 0: Success. + * - < 0: Failure: + * - ERR_INVALID_ARGUMENT (2): The argument is invalid. + * - ERR_NOT_INITIALIZED (7): You have not initialized IRtcEngine when trying to start screen capture. + */ + virtual int startScreenCaptureByWindowId(view_t windowId, const Rectangle& regionRect, + const ScreenCaptureParameters& captureParams) = 0; + + /** + * Sets the content hint for screen sharing. + * + * A content hint suggests the type of the content being shared, so that the SDK applies different + * optimization algorithms to different types of content. + * + * @param contentHint Sets the content hint for screen sharing: #VIDEO_CONTENT_HINT. + * + * @return + * - 0: Success. + * - < 0: Failure: + * - ERR_NOT_SUPPORTED (4): Unable to set the screen capture content hint. + * - ERR_FAILED (1): A general error occurs (no specified reason). + * - ERR_NOT_INITIALIZED (7): You have not initialized IRtcEngine when setting the screen capture content hint. + */ + virtual int setScreenCaptureContentHint(VIDEO_CONTENT_HINT contentHint) = 0; + + /** + * Updates the screen sharing region. + * + * @param regionRect Sets the relative location of the region to the screen or + * window.
NULL means sharing the whole screen or window. See Rectangle. + * If the specified region overruns the screen or window, the SDK shares only + * the region within it; if you set width or height as 0, the SDK shares the + * whole screen or window. + * + * @return + * - 0: Success. + * - < 0: Failure: + * - ERR_NOT_SUPPORTED (4): Unable to update the screen capture region. + * - ERR_FAILED (1): A general error occurs (no specified reason). + * - ERR_NOT_INITIALIZED (7): You have not initialized IRtcEngine when updating the screen capture region. + */ + virtual int updateScreenCaptureRegion(const Rectangle& regionRect) = 0; + + /** + * Updates the screen sharing parameters. + * + * @param captureParams Sets the screen sharing encoding parameters: ScreenCaptureParameters. + * + * @return + * - 0: Success. + * - < 0: Failure. + * - ERR_NOT_SUPPORTED (4): Unable to update the screen capture parameters. + * - ERR_INVALID_ARGUMENT (2): The argument is invalid. + * - ERR_FAILED (1): A general error occurs (no specified reason). + * - ERR_NOT_INITIALIZED (7): You have not initialized IRtcEngine when updating the screen capture parameters. + */ + virtual int updateScreenCaptureParameters(const ScreenCaptureParameters& captureParams) = 0; +#endif // _WIN32 || (__APPLE__ && !TARGET_OS_IPHONE && TARGET_OS_MAC) + +#if defined(__ANDROID__) || (defined(__APPLE__) && TARGET_OS_IOS) + /** + * Starts screen sharing. + * + * @param captureParams The configuration of the screen sharing. See {@link + * ScreenCaptureParameters ScreenCaptureParameters}. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int startScreenCapture(const ScreenCaptureParameters2& captureParams) = 0; + + /** + * Updates the screen sharing configuration. + * + * @param captureParams The configuration of the screen sharing. See {@link + * ScreenCaptureParameters ScreenCaptureParameters}. + * @return + * - 0: Success. + * - < 0: Failure.
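The region-clamping rule above (a region overrunning the screen is clipped; width or height of 0 means the whole screen) can be modeled as a plain rectangle intersection. A hypothetical sketch, with the Rectangle fields assumed as x/y/width/height; not the SDK's implementation:

```cpp
#include <algorithm>

// Illustrative model of updateScreenCaptureRegion clamping. Not SDK code.
struct Rect { int x, y, width, height; };

Rect effectiveRegion(Rect region, Rect screen) {
    // Width or height of 0 means sharing the whole screen.
    if (region.width == 0 || region.height == 0) return screen;
    // Otherwise, share only the part of the region inside the screen.
    int left   = std::max(region.x, screen.x);
    int top    = std::max(region.y, screen.y);
    int right  = std::min(region.x + region.width,  screen.x + screen.width);
    int bottom = std::min(region.y + region.height, screen.y + screen.height);
    if (right <= left || bottom <= top) return {0, 0, 0, 0};  // no overlap
    return {left, top, right - left, bottom - top};
}
```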
+ */ + virtual int updateScreenCapture(const ScreenCaptureParameters2& captureParams) = 0; + + /** + * Queries the maximum frame rate supported by screen sharing. + * + * @since v4.2.0 + * + * @return + * - 0: Supports 15 fps. Low-end devices. + * - 1: Supports 30 fps. Usually low- to mid-range devices. + * - 2: Supports 60 fps. High-end devices. + * - < 0: Failure. + */ + virtual int queryScreenCaptureCapability() = 0; +#endif + +#if defined(_WIN32) || defined(__APPLE__) || defined(__ANDROID__) + /** + * Sets the screen sharing scenario. + * + * When you start screen sharing or window sharing, you can call this method to set the screen sharing scenario. The SDK adjusts the video quality and experience of the sharing according to the scenario. + * + * @param screenScenario The screen sharing scenario. See #SCREEN_SCENARIO_TYPE. + * + * @return + * - 0: Success. + * - < 0: Failure. + * - ERR_NOT_SUPPORTED (4): Unable to set the screen capture scenario. + * - ERR_FAILED (1): A general error occurs (no specified reason). + * - ERR_NOT_INITIALIZED (7): You have not initialized IRtcEngine when setting the screen capture scenario. + */ + virtual int setScreenCaptureScenario(SCREEN_SCENARIO_TYPE screenScenario) = 0; + + /** + * Stops the screen sharing. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int stopScreenCapture() = 0; +#endif // _WIN32 || (__APPLE__ && !TARGET_OS_IPHONE && TARGET_OS_MAC) || __ANDROID__ + + /** + * Gets the current call ID. + * + * When a user joins a channel on a client, a `callId` is generated to identify + * the call. + * + * After a call ends, you can call `rate` or `complain` to gather feedback from the customer. + * These methods require a `callId` parameter. To use these feedback methods, call this + * method first to retrieve the `callId` during the call, and then pass the value as an + * argument in the `rate` or `complain` method after the call ends. + * + * @param callId The reference to the call ID.
+ * @return + * - The call ID if the method call is successful. + * - < 0: Failure. + */ + virtual int getCallId(agora::util::AString& callId) = 0; + + /** + * Allows a user to rate the call. + * + * It is usually called after the call ends. + * + * @param callId The call ID retrieved from the \ref getCallId "getCallId" method. + * @param rating The rating of the call, from 1 (the lowest score) to 5 (the highest score). + * @param description (Optional) The description of the rating. The string length must be less than + * 800 bytes. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int rate(const char* callId, int rating, const char* description) = 0; + + /** + * Allows a user to complain about the call quality. + * + * This method is usually called after the call ends. + * + * @param callId The call ID retrieved from the `getCallId` method. + * @param description (Optional) The description of the complaint. The string length must be less than + * 800 bytes. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int complain(const char* callId, const char* description) = 0; + + /** Publishes the local stream without transcoding to a specified CDN live RTMP address. (CDN live only.) + + * The SDK returns the result of this method call in the \ref IRtcEngineEventHandler::onStreamPublished "onStreamPublished" callback. + + * The \ref agora::rtc::IRtcEngine::startRtmpStreamWithoutTranscoding "startRtmpStreamWithoutTranscoding" method call triggers the \ref agora::rtc::IRtcEngineEventHandler::onRtmpStreamingStateChanged "onRtmpStreamingStateChanged" callback on the local client to report the state of adding a local stream to the CDN. + * @note + * - Ensure that the user joins the channel before calling this method. + * - This method adds only one stream RTMP URL address each time it is called. + * - The RTMP URL address must not contain special characters, such as Chinese language characters.
+ * - This method applies to Live Broadcast only. + + * @param url The CDN streaming URL in the RTMP format. The maximum length of this parameter is 1024 bytes. + + * @return + * - 0: Success. + * - < 0: Failure. + * - #ERR_INVALID_ARGUMENT (2): The RTMP URL address is NULL or has a string length of 0. + * - #ERR_NOT_INITIALIZED (7): You have not initialized the RTC engine when publishing the stream. + * - #ERR_ALREADY_IN_USE (19): This streaming URL is already in use. Use a new streaming URL for CDN streaming. + */ + virtual int startRtmpStreamWithoutTranscoding(const char* url) = 0; + + /** Publishes the local stream with transcoding to a specified CDN live RTMP address. (CDN live only.) + + * The SDK returns the result of this method call in the \ref IRtcEngineEventHandler::onStreamPublished "onStreamPublished" callback. + + * The \ref agora::rtc::IRtcEngine::startRtmpStreamWithTranscoding "startRtmpStreamWithTranscoding" method call triggers the \ref agora::rtc::IRtcEngineEventHandler::onRtmpStreamingStateChanged "onRtmpStreamingStateChanged" callback on the local client to report the state of adding a local stream to the CDN. + * @note + * - Ensure that the user joins the channel before calling this method. + * - This method adds only one stream RTMP URL address each time it is called. + * - The RTMP URL address must not contain special characters, such as Chinese language characters. + * - This method applies to Live Broadcast only. + + * @param url The CDN streaming URL in the RTMP format. The maximum length of this parameter is 1024 bytes. + * @param transcoding Sets the CDN live audio/video transcoding settings. See LiveTranscoding. + + * @return + * - 0: Success. + * - < 0: Failure. + * - #ERR_INVALID_ARGUMENT (2): The RTMP URL address is NULL or has a string length of 0. + * - #ERR_NOT_INITIALIZED (7): You have not initialized the RTC engine when publishing the stream. + * - #ERR_ALREADY_IN_USE (19): This streaming URL is already in use. 
Use a new streaming URL for CDN streaming. + */ + virtual int startRtmpStreamWithTranscoding(const char* url, const LiveTranscoding& transcoding) = 0; + + /** Updates the video layout and audio settings for CDN live. (CDN live only.) + * @note This method applies to Live Broadcast only. + + * @param transcoding Sets the CDN live audio/video transcoding settings. See LiveTranscoding. + + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int updateRtmpTranscoding(const LiveTranscoding& transcoding) = 0; + + virtual int startLocalVideoTranscoder(const LocalTranscoderConfiguration& config) = 0; + virtual int updateLocalTranscoderConfiguration(const LocalTranscoderConfiguration& config) = 0; + + /** Stops an RTMP stream, with or without transcoding, from the CDN. (CDN live only.) + + * This method removes the RTMP URL address (added by the \ref IRtcEngine::startRtmpStreamWithoutTranscoding "startRtmpStreamWithoutTranscoding" method + * or the \ref IRtcEngine::startRtmpStreamWithTranscoding "startRtmpStreamWithTranscoding" method) from a CDN live stream. + * The SDK returns the result of this method call in the \ref IRtcEngineEventHandler::onStreamUnpublished "onStreamUnpublished" callback. + + * The \ref agora::rtc::IRtcEngine::stopRtmpStream "stopRtmpStream" method call triggers the \ref agora::rtc::IRtcEngineEventHandler::onRtmpStreamingStateChanged "onRtmpStreamingStateChanged" callback on the local client to report the state of removing an RTMP stream from the CDN. + * @note + * - This method removes only one RTMP URL address each time it is called. + * - The RTMP URL address must not contain special characters, such as Chinese language characters. + * - This method applies to Live Broadcast only. + + * @param url The RTMP URL address to be removed. The maximum length of this parameter is 1024 bytes. + + * @return + * - 0: Success. + * - < 0: Failure.
+ */ + virtual int stopRtmpStream(const char* url) = 0; + + virtual int stopLocalVideoTranscoder() = 0; + /** + * Starts video capture with a camera. + * + * @param sourceType The source type of the camera. See #VIDEO_SOURCE_TYPE. + * @param config The configuration of the video capture with a primary camera. For details, see CameraCapturerConfiguration. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int startCameraCapture(VIDEO_SOURCE_TYPE sourceType, const CameraCapturerConfiguration& config) = 0; + + /** + * Stops capturing video through the camera. + * + * You can call this method to stop capturing video through the first camera after calling `startCameraCapture`. + * + * @param sourceType The source type of the camera. See #VIDEO_SOURCE_TYPE. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int stopCameraCapture(VIDEO_SOURCE_TYPE sourceType) = 0; + /** + * Sets the rotation angle of the video captured by the camera. + * + * When the video capture device does not have the gravity sensing function, you can call this method to manually adjust the rotation angle of the captured video. + * + * @param type The video source type. See #VIDEO_SOURCE_TYPE. + * @param orientation The clockwise rotation angle. See #VIDEO_ORIENTATION. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setCameraDeviceOrientation(VIDEO_SOURCE_TYPE type, VIDEO_ORIENTATION orientation) = 0; + /** + * Sets the rotation angle of the video captured by the screen. + * + * When the screen capture device does not have the gravity sensing function, you can call this method to manually adjust the rotation angle of the captured video. + * + * @param type The video source type. See #VIDEO_SOURCE_TYPE. + * @param orientation The clockwise rotation angle. See #VIDEO_ORIENTATION. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setScreenCaptureOrientation(VIDEO_SOURCE_TYPE type, VIDEO_ORIENTATION orientation) = 0; + + /** + * Starts sharing a screen.
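The orientation setters above accept only the clockwise angles enumerated by #VIDEO_ORIENTATION (0, 90, 180, and 270 degrees). As an illustrative sketch (this helper is hypothetical and not part of the SDK), an app that reads a raw sensor rotation might snap it to the nearest supported angle before mapping it onto the enum:

```cpp
#include <cassert>

// Hypothetical helper, not an SDK API: normalize an arbitrary device rotation
// (in degrees, possibly negative or > 360) to the nearest clockwise angle
// accepted by setCameraDeviceOrientation / setScreenCaptureOrientation.
static int nearestClockwiseOrientation(int degrees) {
    // Wrap the angle into [0, 360).
    int d = ((degrees % 360) + 360) % 360;
    // Round to the nearest multiple of 90, wrapping 360 back to 0.
    return ((d + 45) / 90) % 4 * 90;
}
```

The returned value (0, 90, 180, or 270) would then be converted to the corresponding #VIDEO_ORIENTATION member before the call.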
+ * + * @param sourceType The source type of the screen. See #VIDEO_SOURCE_TYPE. + * @param config The configuration of the captured screen. For details, see ScreenCaptureConfiguration. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int startScreenCapture(VIDEO_SOURCE_TYPE sourceType, const ScreenCaptureConfiguration& config) = 0; + + /** + * Stops sharing the screen. + * + * After calling `startScreenCapture`, you can call this method to stop sharing the first screen. + * + * @param sourceType The source type of the screen. See #VIDEO_SOURCE_TYPE. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int stopScreenCapture(VIDEO_SOURCE_TYPE sourceType) = 0; + + /** Gets the current connection state of the SDK. + + @return #CONNECTION_STATE_TYPE. + */ + virtual CONNECTION_STATE_TYPE getConnectionState() = 0; + + // The following APIs are not implemented yet. + virtual bool registerEventHandler(IRtcEngineEventHandler* eventHandler) = 0; + virtual bool unregisterEventHandler(IRtcEngineEventHandler* eventHandler) = 0; + virtual int setRemoteUserPriority(uid_t uid, PRIORITY_TYPE userPriority) = 0; + + /** + * Registers a packet observer. + * + * The Agora Native SDK allows your app to register a packet observer to + * receive events whenever a voice or video packet is being transmitted. + * + * @param observer The IPacketObserver object. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int registerPacketObserver(IPacketObserver* observer) = 0; + + /** + * Sets the built-in encryption mode. + * + * @deprecated This method is deprecated. Use enableEncryption(bool enabled, const EncryptionConfig&) instead. + * + * The Agora Native SDK supports built-in encryption. + * Call this API to set the encryption mode. + * + * All users in the same channel must use the same encryption mode and password. + * Refer to information related to the encryption algorithm on the differences + * between encryption modes.
+ * + * @note + * Call \ref setEncryptionSecret "setEncryptionSecret" to enable the built-in encryption function + * before calling this API. + * @param encryptionMode Encryption mode: + * - "sm4-128-ecb": 128-bit SM4 encryption, ECB mode. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setEncryptionMode(const char* encryptionMode) __deprecated = 0; + + /** + * Enables built-in encryption. + * + * @deprecated This method is deprecated. Use enableEncryption(bool enabled, const EncryptionConfig&) instead. + * + * Use this method to specify an encryption password to enable built-in + * encryption before joining a channel. All users in a channel must set the same + * encryption password. The encryption password is automatically cleared once a + * user has left the channel. If the encryption password is not specified or set to + * empty, the encryption function will be disabled. + * + * @param secret The encryption password. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setEncryptionSecret(const char* secret) __deprecated = 0; + + /** Enables/Disables the built-in encryption. + * + * In scenarios requiring high security, Agora recommends calling this method to enable the built-in encryption before joining a channel. + * + * All users in the same channel must use the same encryption mode and encryption key. Once all users leave the channel, the encryption key of this channel is automatically cleared. + * + * @note + * - If you enable the built-in encryption, you cannot use the RTMP streaming function. + * + * @param enabled Whether to enable the built-in encryption: + * - true: Enable the built-in encryption. + * - false: Disable the built-in encryption. + * @param config Configurations of built-in encryption schemas. See EncryptionConfig. + * + * @return + * - 0: Success. + * - < 0: Failure. + * - -2(ERR_INVALID_ARGUMENT): An invalid parameter is used. Set the parameter with a valid value. 
+ * - -4(ERR_NOT_SUPPORTED): The encryption mode is incorrect or the SDK fails to load the external encryption library. Check the enumeration or reload the external encryption library. + * - -7(ERR_NOT_INITIALIZED): The SDK is not initialized. Initialize the `IRtcEngine` instance before calling this method. + */ + virtual int enableEncryption(bool enabled, const EncryptionConfig& config) = 0; + + /** Creates a data stream. + * + * You can call this method to create a data stream and improve the + * reliability and ordering of data transmission. + * + * @note + * - Ensure that you set the same value for `reliable` and `ordered`. + * - Each user can only create a maximum of 5 data streams during an RtcEngine + * lifecycle. + * - The data channel allows a data delay of up to 5 seconds. If the receiver + * does not receive the data stream within 5 seconds, the data channel reports + * an error. + * + * @param[out] streamId The ID of the stream data. + * @param reliable Sets whether the recipients are guaranteed to receive + * the data stream from the sender within five seconds: + * - true: The recipients receive the data stream from the sender within + * five seconds. If the recipient does not receive the data stream within + * five seconds, an error is reported to the application. + * - false: There is no guarantee that the recipients receive the data stream + * within five seconds and no error message is reported for any delay or + * missing data stream. + * @param ordered Sets whether the recipients receive the data stream + * in the sent order: + * - true: The recipients receive the data stream in the sent order. + * - false: The recipients do not receive the data stream in the sent order. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int createDataStream(int* streamId, bool reliable, bool ordered) = 0; + + /** Creates a data stream. + * + * Each user can create up to five data streams during the lifecycle of the IChannel.
+ * @param streamId The ID of the created data stream. + * @param config The configuration of the data stream. + * @return int + * - Returns 0: Success. + * - < 0: Failure. + */ + virtual int createDataStream(int* streamId, const DataStreamConfig& config) = 0; + + /** Sends a data stream. + * + * After calling \ref IRtcEngine::createDataStream "createDataStream", you can call + * this method to send a data stream to all users in the channel. + * + * The SDK has the following restrictions on this method: + * - Up to 60 packets can be sent per second in a channel with each packet having a maximum size of 1 KB. + * - Each client can send up to 30 KB of data per second. + * - Each user can have up to five data streams simultaneously. + * + * If the remote user receives the data stream within 5 seconds, the SDK triggers the + * \ref IRtcEngineEventHandler::onStreamMessage "onStreamMessage" callback on + * the remote client. If the remote user does not receive the data stream within 5 seconds, + * the SDK triggers the \ref IRtcEngineEventHandler::onStreamMessageError "onStreamMessageError" + * callback on the remote client. + * + * @note + * - Call this method after calling \ref IRtcEngine::createDataStream "createDataStream". + * - This method applies only to the `COMMUNICATION` profile or to + * the hosts in the `LIVE_BROADCASTING` profile. If an audience in the + * `LIVE_BROADCASTING` profile calls this method, the audience may be switched to a host. + * + * @param streamId The ID of the stream data. + * @param data The data stream. + * @param length The length (in bytes) of the data stream. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int sendStreamMessage(int streamId, const char* data, size_t length) = 0; + + /** **DEPRECATED** Adds a watermark image to the local video or CDN live stream.
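The sendStreamMessage limits above (1 KB per packet, 60 packets and 30 KB per second) mean a caller with a larger payload must chunk and pace it. The following sketch is illustrative only; the helpers are hypothetical, not SDK APIs, and only model the documented budget:

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// Documented sendStreamMessage limits, used as constants here.
static const size_t kMaxPacketBytes   = 1024;      // 1 KB per packet
static const size_t kMaxPacketsPerSec = 60;        // 60 packets per second
static const size_t kMaxBytesPerSec   = 30 * 1024; // 30 KB per second

// Hypothetical helper: split a payload into packets of at most 1 KB each,
// suitable for passing one by one to sendStreamMessage.
static std::vector<std::string> chunkForDataStream(const std::string& payload) {
    std::vector<std::string> chunks;
    for (size_t off = 0; off < payload.size(); off += kMaxPacketBytes) {
        chunks.push_back(payload.substr(off, kMaxPacketBytes));
    }
    return chunks;
}

// Hypothetical helper: true if the whole payload can be flushed within one
// second without exceeding either the packet-count or byte budget.
static bool fitsInOneSecond(const std::string& payload) {
    return chunkForDataStream(payload).size() <= kMaxPacketsPerSec &&
           payload.size() <= kMaxBytesPerSec;
}
```

A real sender would queue the chunks and pace them across seconds when `fitsInOneSecond` is false, rather than dropping data.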
+ + This method is not recommended. Use \ref agora::rtc::IRtcEngine::addVideoWatermark(const char* watermarkUrl, const WatermarkOptions& options) "addVideoWatermark2" instead. + + This method adds a PNG watermark image to the local video stream for the recording device, channel audience, and CDN live audience to view and capture. + + To add the PNG file to the CDN live publishing stream, see the \ref IRtcEngine::setLiveTranscoding "setLiveTranscoding" method. + + @param watermark Pointer to the watermark image to be added to the local video stream. See RtcImage. + + @note + - The URL descriptions are different for the local video and CDN live streams: + - In a local video stream, `url` in RtcImage refers to the absolute path of the added watermark image file in the local video stream. + - In a CDN live stream, `url` in RtcImage refers to the URL address of the added watermark image in the CDN live broadcast. + - The source file of the watermark image must be in the PNG file format. If the width and height of the PNG file differ from your settings in this method, the PNG file will be cropped to conform to your settings. + - The Agora SDK supports adding only one watermark image onto a local video or CDN live stream. The newly added watermark image replaces the previous one. + + @return + - 0: Success. + - < 0: Failure. + */ + virtual int addVideoWatermark(const RtcImage& watermark) __deprecated = 0; + + /** Adds a watermark image to the local video. + + This method adds a PNG watermark image to the local video in a live broadcast. Once the watermark image is added, all the audience in the channel (CDN audience included), + and the recording device can see and capture it. Agora supports adding only one watermark image onto the local video, and the new watermark image replaces the previous one.
+ + The watermark position depends on the settings in the \ref IRtcEngine::setVideoEncoderConfiguration "setVideoEncoderConfiguration" method: + - If the orientation mode of the encoding video is #ORIENTATION_MODE_FIXED_LANDSCAPE, or the landscape mode in #ORIENTATION_MODE_ADAPTIVE, the watermark uses the landscape orientation. + - If the orientation mode of the encoding video is #ORIENTATION_MODE_FIXED_PORTRAIT, or the portrait mode in #ORIENTATION_MODE_ADAPTIVE, the watermark uses the portrait orientation. + - When setting the watermark position, the region must be less than the dimensions set in the `setVideoEncoderConfiguration` method. Otherwise, the watermark image will be cropped. + + @note + - Ensure that you have called the \ref agora::rtc::IRtcEngine::enableVideo "enableVideo" method to enable the video module before calling this method. + - If you only want to add a watermark image to the local video for the audience in the CDN live broadcast channel to see and capture, you can call this method or the \ref agora::rtc::IRtcEngine::setLiveTranscoding "setLiveTranscoding" method. + - This method supports adding a watermark image in the PNG file format only. Supported pixel formats of the PNG image are RGBA, RGB, Palette, Gray, and Alpha_gray. + - If the dimensions of the PNG image differ from your settings in this method, the image will be cropped or zoomed to conform to your settings. + - If you have enabled the local video preview by calling the \ref agora::rtc::IRtcEngine::startPreview "startPreview" method, you can use the `visibleInPreview` member in the WatermarkOptions class to set whether or not the watermark is visible in preview. + - If you have enabled the mirror mode for the local video, the watermark on the local video is also mirrored. To avoid mirroring the watermark, Agora recommends that you do not use the mirror and watermark functions for the local video at the same time. You can implement the watermark function in your application layer. 
+ + @param watermarkUrl The local file path of the watermark image to be added. This method supports adding a watermark image from the local absolute or relative file path. + @param options Pointer to the watermark's options to be added. See WatermarkOptions for more information. + + @return int + - 0: Success. + - < 0: Failure. + */ + virtual int addVideoWatermark(const char* watermarkUrl, const WatermarkOptions& options) = 0; + + /** Removes the watermark image on the video stream added by + addVideoWatermark(). + + @return + - 0: Success. + - < 0: Failure. + */ + virtual int clearVideoWatermarks() = 0; + + // The following APIs are deprecated and going to be deleted. + + /** @deprecated Use disableAudio() instead. + + Disables the audio function in the channel. + + @return int + - 0: Success. + - < 0: Failure. + */ + virtual int pauseAudio() __deprecated = 0; + /** @deprecated Use enableAudio() instead. + + Resumes the audio function in the channel. + + @return int + - 0: Success. + - < 0: Failure. + */ + virtual int resumeAudio() __deprecated = 0; + + /** + * Enables interoperability with the Agora Web SDK (Live Broadcast only). + * + * @deprecated The Agora NG SDK enables the interoperability with the Web SDK. + * + * Use this method when the channel profile is Live Broadcast. Interoperability + * with the Agora Web SDK is enabled by default when the channel profile is + * Communication. + * + * @param enabled Determines whether to enable interoperability with the Agora Web SDK. + * - true: (Default) Enable interoperability with the Agora Web SDK. + * - false: Disable interoperability with the Agora Web SDK. + * + * @return int + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableWebSdkInteroperability(bool enabled) __deprecated = 0; + + /** Agora supports reporting and analyzing customized messages. + * + * This function is in the beta stage with a free trial.
In its beta stage, it supports reporting a maximum of 10 message pieces within + * 6 seconds, with each message piece not exceeding 256 bytes. + * + * To try out this function, contact [support@agora.io](mailto:support@agora.io) + * and discuss the format of customized messages with us. + */ + virtual int sendCustomReportMessage(const char* id, const char* category, const char* event, const char* label, int value) = 0; + + /** Registers the metadata observer. + + You need to implement the IMetadataObserver class and specify the metadata type + in this method. This method enables you to add synchronized metadata in the video + stream for more diversified live interactive streaming, such as sending + shopping links, digital coupons, and online quizzes. + + A successful call of this method triggers + the \ref agora::rtc::IMetadataObserver::getMaxMetadataSize "getMaxMetadataSize" callback. + + @note + - Call this method before the `joinChannel` method. + - This method applies to the `LIVE_BROADCASTING` channel profile. + + @param observer IMetadataObserver. + @param type The metadata type. See \ref IMetadataObserver::METADATA_TYPE "METADATA_TYPE". The SDK supports VIDEO_METADATA (0) only for now. + + @return + - 0: Success. + - < 0: Failure. + */ + virtual int registerMediaMetadataObserver(IMetadataObserver *observer, IMetadataObserver::METADATA_TYPE type) = 0; + + /** Unregisters the metadata observer. + @param observer IMetadataObserver. + @param type The metadata type. See \ref IMetadataObserver::METADATA_TYPE "METADATA_TYPE". The SDK supports VIDEO_METADATA (0) only for now. + + @return + - 0: Success. + - < 0: Failure. + */ + virtual int unregisterMediaMetadataObserver(IMetadataObserver* observer, IMetadataObserver::METADATA_TYPE type) = 0; + + /** Starts the audio frame dump. + + Optional `location` is: "pre_apm_proc", "apm", "pre_send_proc", "filter", "enc", "tx_mixer", + "at_record", "atw_record" for audio sending.
+ "dec", "mixed", "play", "rx_mixer", "playback_mixer", "pcm_source_playback_mixer", + "pre_play_proc", "at_playout", "atw_playout" for audio receiving. + + */ + virtual int startAudioFrameDump(const char* channel_id, uid_t uid, const char* location, const char* uuid, const char* passwd, long duration_ms, bool auto_upload) = 0; + + /** + * Stops the audio frame dump. + */ + virtual int stopAudioFrameDump(const char* channel_id, uid_t uid, const char* location) = 0; + + /** + * Enables/Disables Agora AI Noise Suppression(AINS) with preset mode. + * + * @param enabled Sets whether or not to enable AINS. + * - true: Enables the AINS. + * - false: Disables the AINS. + * @param mode The preset AINS mode, range is [0,1,2]: + * 0: AINS mode with soft suppression level. + * 1: AINS mode with aggressive suppression level. + * 2: AINS mode with aggressive suppression level and low algorithm latency. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setAINSMode(bool enabled, AUDIO_AINS_MODE mode) = 0; + + /** Registers a user account. + * + * Once registered, the user account can be used to identify the local user when the user joins the channel. + * After the user successfully registers a user account, the SDK triggers the \ref agora::rtc::IRtcEngineEventHandler::onLocalUserRegistered "onLocalUserRegistered" callback on the local client, + * reporting the user ID and user account of the local user. + * + * To join a channel with a user account, you can choose either of the following: + * + * - Call the \ref agora::rtc::IRtcEngine::registerLocalUserAccount "registerLocalUserAccount" method to create a user account, and then the \ref agora::rtc::IRtcEngine::joinChannelWithUserAccount "joinChannelWithUserAccount" method to join the channel. + * - Call the \ref agora::rtc::IRtcEngine::joinChannelWithUserAccount "joinChannelWithUserAccount" method to join the channel. 
+ * + * The difference between the two is that for the former, the time elapsed between calling the \ref agora::rtc::IRtcEngine::joinChannelWithUserAccount "joinChannelWithUserAccount" method + * and joining the channel is shorter than the latter. + * + * @note + * - Ensure that you set the `userAccount` parameter. Otherwise, this method does not take effect. + * - Ensure that the value of the `userAccount` parameter is unique in the channel. + * - To ensure smooth communication, use the same parameter type to identify the user. For example, if a user joins the channel with a user ID, then ensure all the other users use the user ID too. The same applies to the user account. If a user joins the channel with the Agora Web SDK, ensure that the uid of the user is set to the same parameter type. + * + * @param appId The App ID of your project. + * @param userAccount The user account. The maximum length of this parameter is 255 bytes. Ensure that you set this parameter and do not set it as null. Supported character scopes are: + * - All lowercase English letters: a to z. + * - All uppercase English letters: A to Z. + * - All numeric characters: 0 to 9. + * - The space character. + * - Punctuation characters and other symbols, including: "!", "#", "$", "%", "&", "(", ")", "+", "-", ":", ";", "<", "=", ".", ">", "?", "@", "[", "]", "^", "_", " {", "}", "|", "~", ",". + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int registerLocalUserAccount(const char* appId, const char* userAccount) = 0; + + /** Joins the channel with a user account. + * + * After the user successfully joins the channel, the SDK triggers the following callbacks: + * + * - The local client: \ref agora::rtc::IRtcEngineEventHandler::onLocalUserRegistered "onLocalUserRegistered" and \ref agora::rtc::IRtcEngineEventHandler::onJoinChannelSuccess "onJoinChannelSuccess" . 
+ * - The remote client: \ref agora::rtc::IRtcEngineEventHandler::onUserJoined "onUserJoined" and \ref agora::rtc::IRtcEngineEventHandler::onUserInfoUpdated "onUserInfoUpdated" , if the user joining the channel is in the `COMMUNICATION` profile, or is a host in the `LIVE_BROADCASTING` profile. + * + * @note To ensure smooth communication, use the same parameter type to identify the user. For example, if a user joins the channel with a user ID, then ensure all the other users use the user ID too. The same applies to the user account. + * If a user joins the channel with the Agora Web SDK, ensure that the uid of the user is set to the same parameter type. + * + * @param token The token generated at your server: + * - For low-security requirements: You can use the temporary token generated at Console. For details, see [Get a temporary toke](https://docs.agora.io/en/Voice/token?platform=All%20Platforms#get-a-temporary-token). + * - For high-security requirements: Set it as the token generated at your server. For details, see [Get a token](https://docs.agora.io/en/Voice/token?platform=All%20Platforms#get-a-token). + * @param channelId The channel name. The maximum length of this parameter is 64 bytes. Supported character scopes are: + * - All lowercase English letters: a to z. + * - All uppercase English letters: A to Z. + * - All numeric characters: 0 to 9. + * - The space character. + * - Punctuation characters and other symbols, including: "!", "#", "$", "%", "&", "(", ")", "+", "-", ":", ";", "<", "=", ".", ">", "?", "@", "[", "]", "^", "_", " {", "}", "|", "~", ",". + * @param userAccount The user account. The maximum length of this parameter is 255 bytes. Ensure that you set this parameter and do not set it as null. Supported character scopes are: + * - All lowercase English letters: a to z. + * - All uppercase English letters: A to Z. + * - All numeric characters: 0 to 9. + * - The space character. 
+ * - Punctuation characters and other symbols, including: "!", "#", "$", "%", "&", "(", ")", "+", "-", ":", ";", "<", "=", ".", ">", "?", "@", "[", "]", "^", "_", " {", "}", "|", "~", ",". + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int joinChannelWithUserAccount(const char* token, const char* channelId, const char* userAccount) = 0; + + /** Joins the channel with a user account. + * + * After the user successfully joins the channel, the SDK triggers the following callbacks: + * + * - The local client: \ref agora::rtc::IRtcEngineEventHandler::onLocalUserRegistered "onLocalUserRegistered" and \ref agora::rtc::IRtcEngineEventHandler::onJoinChannelSuccess "onJoinChannelSuccess" . + * - The remote client: \ref agora::rtc::IRtcEngineEventHandler::onUserJoined "onUserJoined" and \ref agora::rtc::IRtcEngineEventHandler::onUserInfoUpdated "onUserInfoUpdated" , if the user joining the channel is in the `COMMUNICATION` profile, or is a host in the `LIVE_BROADCASTING` profile. + * + * @note To ensure smooth communication, use the same parameter type to identify the user. For example, if a user joins the channel with a user ID, then ensure all the other users use the user ID too. The same applies to the user account. + * If a user joins the channel with the Agora Web SDK, ensure that the uid of the user is set to the same parameter type. + * + * @param token The token generated at your server: + * - For low-security requirements: You can use the temporary token generated at Console. For details, see [Get a temporary toke](https://docs.agora.io/en/Voice/token?platform=All%20Platforms#get-a-temporary-token). + * - For high-security requirements: Set it as the token generated at your server. For details, see [Get a token](https://docs.agora.io/en/Voice/token?platform=All%20Platforms#get-a-token). + * @param channelId The channel name. The maximum length of this parameter is 64 bytes. 
Supported character scopes are: + * - All lowercase English letters: a to z. + * - All uppercase English letters: A to Z. + * - All numeric characters: 0 to 9. + * - The space character. + * - Punctuation characters and other symbols, including: "!", "#", "$", "%", "&", "(", ")", "+", "-", ":", ";", "<", "=", ".", ">", "?", "@", "[", "]", "^", "_", " {", "}", "|", "~", ",". + * @param userAccount The user account. The maximum length of this parameter is 255 bytes. Ensure that you set this parameter and do not set it as null. Supported character scopes are: + * - All lowercase English letters: a to z. + * - All uppercase English letters: A to Z. + * - All numeric characters: 0 to 9. + * - The space character. + * - Punctuation characters and other symbols, including: "!", "#", "$", "%", "&", "(", ")", "+", "-", ":", ";", "<", "=", ".", ">", "?", "@", "[", "]", "^", "_", " {", "}", "|", "~", ",". + * @param options The channel media options: \ref agora::rtc::ChannelMediaOptions::ChannelMediaOptions "ChannelMediaOptions" + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int joinChannelWithUserAccount(const char* token, const char* channelId, const char* userAccount, const ChannelMediaOptions& options) = 0; + + /** Joins the channel with a user account. + * + * After the user successfully joins the channel, the SDK triggers the following callbacks: + * + * - The local client: \ref agora::rtc::IRtcEngineEventHandler::onLocalUserRegistered "onLocalUserRegistered" and \ref agora::rtc::IRtcEngineEventHandler::onJoinChannelSuccess "onJoinChannelSuccess" . + * - The remote client: \ref agora::rtc::IRtcEngineEventHandler::onUserJoined "onUserJoined" and \ref agora::rtc::IRtcEngineEventHandler::onUserInfoUpdated "onUserInfoUpdated" , if the user joining the channel is in the `COMMUNICATION` profile, or is a host in the `LIVE_BROADCASTING` profile. + * + * @note To ensure smooth communication, use the same parameter type to identify the user. 
For example, if a user joins the channel with a user ID, then ensure all the other users use the user ID too. The same applies to the user account.
+ * If a user joins the channel with the Agora Web SDK, ensure that the uid of the user is set to the same parameter type.
+ *
+ * @param token The token generated at your server:
+ * - For low-security requirements: You can use the temporary token generated at Console. For details, see [Get a temporary token](https://docs.agora.io/en/Voice/token?platform=All%20Platforms#get-a-temporary-token).
+ * - For high-security requirements: Set it as the token generated at your server. For details, see [Get a token](https://docs.agora.io/en/Voice/token?platform=All%20Platforms#get-a-token).
+ * @param channelId The channel name. The maximum length of this parameter is 64 bytes. Supported character scopes are:
+ * - All lowercase English letters: a to z.
+ * - All uppercase English letters: A to Z.
+ * - All numeric characters: 0 to 9.
+ * - The space character.
+ * - Punctuation characters and other symbols, including: "!", "#", "$", "%", "&", "(", ")", "+", "-", ":", ";", "<", "=", ".", ">", "?", "@", "[", "]", "^", "_", "{", "}", "|", "~", ",".
+ * @param userAccount The user account. The maximum length of this parameter is 255 bytes. Ensure that you set this parameter and do not set it as null. Supported character scopes are:
+ * - All lowercase English letters: a to z.
+ * - All uppercase English letters: A to Z.
+ * - All numeric characters: 0 to 9.
+ * - The space character.
+ * - Punctuation characters and other symbols, including: "!", "#", "$", "%", "&", "(", ")", "+", "-", ":", ";", "<", "=", ".", ">", "?", "@", "[", "]", "^", "_", "{", "}", "|", "~", ",".
+ * @param options The channel media options: \ref agora::rtc::ChannelMediaOptions::ChannelMediaOptions "ChannelMediaOptions"
+ * @param eventHandler The pointer to the IRtcEngine event handler: IRtcEngineEventHandler.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int joinChannelWithUserAccountEx(const char* token, const char* channelId,
+                                          const char* userAccount, const ChannelMediaOptions& options,
+                                          IRtcEngineEventHandler* eventHandler) = 0;
+
+ /** Gets the user information by passing in the user account.
+ *
+ * After a remote user joins the channel, the SDK gets the user ID and user account of the remote user, caches them
+ * in a mapping table object (`userInfo`), and triggers the \ref agora::rtc::IRtcEngineEventHandler::onUserInfoUpdated "onUserInfoUpdated" callback on the local client.
+ *
+ * After receiving the \ref agora::rtc::IRtcEngineEventHandler::onUserInfoUpdated "onUserInfoUpdated" callback, you can call this method to get the user ID of the
+ * remote user from the `userInfo` object by passing in the user account.
+ *
+ * @param userAccount The user account of the user. Ensure that you set this parameter.
+ * @param[in,out] userInfo A userInfo object that identifies the user:
+ * - Input: A userInfo object.
+ * - Output: A userInfo object that contains the user account and user ID of the user.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int getUserInfoByUserAccount(const char* userAccount, rtc::UserInfo* userInfo) = 0;
+
+ /** Gets the user information by passing in the user ID.
+ *
+ * After a remote user joins the channel, the SDK gets the user ID and user account of the remote user,
+ * caches them in a mapping table object (`userInfo`), and triggers the \ref agora::rtc::IRtcEngineEventHandler::onUserInfoUpdated "onUserInfoUpdated" callback on the local client.
+ *
+ * After receiving the \ref agora::rtc::IRtcEngineEventHandler::onUserInfoUpdated "onUserInfoUpdated" callback, you can call this method to get the user account of the remote user
+ * from the `userInfo` object by passing in the user ID.
+ *
+ * @param uid The user ID of the remote user. Ensure that you set this parameter.
+ * @param[in,out] userInfo A userInfo object that identifies the user:
+ * - Input: A userInfo object.
+ * - Output: A userInfo object that contains the user account and user ID of the user.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int getUserInfoByUid(uid_t uid, rtc::UserInfo* userInfo) = 0;
+
+ /** Starts relaying media streams across channels or updates the channels for media relay.
+ *
+ * After a successful method call, the SDK triggers the
+ * \ref agora::rtc::IRtcEngineEventHandler::onChannelMediaRelayStateChanged
+ * "onChannelMediaRelayStateChanged" callback, and this callback returns the state of the media stream relay.
+ * - If the
+ * \ref agora::rtc::IRtcEngineEventHandler::onChannelMediaRelayStateChanged
+ * "onChannelMediaRelayStateChanged" callback returns
+ * #RELAY_STATE_RUNNING (2) and #RELAY_OK (0), the host starts sending data to the destination channel.
+ * - If the
+ * \ref agora::rtc::IRtcEngineEventHandler::onChannelMediaRelayStateChanged
+ * "onChannelMediaRelayStateChanged" callback returns
+ * #RELAY_STATE_FAILURE (3), an exception occurs during the media stream
+ * relay.
+ *
+ * @note
+ * - Call this method after the \ref joinChannel() "joinChannel" method.
+ * - This method takes effect only when you are a host in a
+ * `LIVE_BROADCASTING` channel.
+ * - Contact sales-us@agora.io before implementing this function.
+ * - We do not support string user accounts in this API.
+ *
+ * @since v4.2.0
+ * @param configuration The configuration of the media stream relay:
+ * ChannelMediaRelayConfiguration.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ * - -1(ERR_FAILED): A general error occurs (no specified reason).
+ * - -2(ERR_INVALID_ARGUMENT): The argument is invalid.
+ * - -5(ERR_REFUSED): The request is rejected.
+ * - -8(ERR_INVALID_STATE): The current state is invalid; this method can only be called when the role is broadcaster.
+ **/
+ virtual int startOrUpdateChannelMediaRelay(const ChannelMediaRelayConfiguration &configuration) = 0;
+
+ /** Stops the media stream relay.
+ *
+ * Once the relay stops, the host quits all the destination
+ * channels.
+ *
+ * After a successful method call, the SDK triggers the
+ * \ref agora::rtc::IRtcEngineEventHandler::onChannelMediaRelayStateChanged
+ * "onChannelMediaRelayStateChanged" callback. If the callback returns
+ * #RELAY_STATE_IDLE (0) and #RELAY_OK (0), the host successfully
+ * stops the relay.
+ *
+ * @note
+ * If the method call fails, the SDK triggers the
+ * \ref agora::rtc::IRtcEngineEventHandler::onChannelMediaRelayStateChanged
+ * "onChannelMediaRelayStateChanged" callback with the
+ * #RELAY_ERROR_SERVER_NO_RESPONSE (2) or
+ * #RELAY_ERROR_SERVER_CONNECTION_LOST (8) state code. You can leave the
+ * channel by calling the \ref leaveChannel() "leaveChannel" method, and
+ * the media stream relay automatically stops.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ * - -1(ERR_FAILED): A general error occurs (no specified reason).
+ * - -2(ERR_INVALID_ARGUMENT): The argument is invalid.
+ * - -5(ERR_REFUSED): The request is rejected.
+ * - -7(ERR_NOT_INITIALIZED): The cross-channel media stream relay has not started.
+ */
+ virtual int stopChannelMediaRelay() = 0;
+
+ /** Pauses the media stream relay to all destination channels.
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ * - -1(ERR_FAILED): A general error occurs (no specified reason).
+ * - -2(ERR_INVALID_ARGUMENT): The argument is invalid.
+ * - -5(ERR_REFUSED): The request is rejected.
+ * - -7(ERR_NOT_INITIALIZED): The cross-channel media stream relay has not started.
+ */
+ virtual int pauseAllChannelMediaRelay() = 0;
+
+ /** Resumes the media stream relay to all destination channels.
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ * - -1(ERR_FAILED): A general error occurs (no specified reason).
+ * - -2(ERR_INVALID_ARGUMENT): The argument is invalid.
+ * - -5(ERR_REFUSED): The request is rejected.
+ * - -7(ERR_NOT_INITIALIZED): The cross-channel media stream relay has not started.
+ */
+ virtual int resumeAllChannelMediaRelay() = 0;
+
+ /** Sets the audio parameters for direct streaming to the CDN.
+ *
+ * @note
+ * Call this method before `startDirectCdnStreaming`.
+ *
+ * @param profile Sets the sample rate, bitrate, encoding mode, and the number of channels:
+ * #AUDIO_PROFILE_TYPE.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setDirectCdnStreamingAudioConfiguration(AUDIO_PROFILE_TYPE profile) = 0;
+
+ /** Sets the video parameters for direct streaming to the CDN.
+ *
+ * Each configuration profile corresponds to a set of video parameters, including
+ * the resolution, frame rate, and bitrate.
+ *
+ * @note
+ * Call this method before `startDirectCdnStreaming`.
+ *
+ * @param config The local video encoder configuration: VideoEncoderConfiguration.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setDirectCdnStreamingVideoConfiguration(const VideoEncoderConfiguration& config) = 0;
+
+ /** Starts direct CDN streaming.
+ *
+ * @param eventHandler A pointer to the direct CDN streaming event handler: \ref agora::rtc::IDirectCdnStreamingEventHandler
+ * "IDirectCdnStreamingEventHandler".
+ * @param publishUrl The URL of the CDN used to publish the stream.
+ * @param options The direct CDN streaming media options: DirectCdnStreamingMediaOptions.
+ * You must pass exactly one audio-related option; passing more than one is temporarily not supported.
+ * For video-related options, pass either none or exactly one.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int startDirectCdnStreaming(IDirectCdnStreamingEventHandler* eventHandler,
+                                     const char* publishUrl, const DirectCdnStreamingMediaOptions& options) = 0;
+
+ /** Stops direct CDN streaming.
+ *
+ * @note
+ * This method is synchronous.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int stopDirectCdnStreaming() = 0;
+
+ /** Changes the media source during streaming.
+ *
+ * @note
+ * This method is temporarily not supported.
+ *
+ * @param options The direct CDN streaming media options: DirectCdnStreamingMediaOptions.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int updateDirectCdnStreamingMediaOptions(const DirectCdnStreamingMediaOptions& options) = 0;
+
+ /** Enables the rhythm player.
+ *
+ * @param sound1 The absolute path or URL address (including the filename extensions) of the file for the downbeat.
+ * @param sound2 The absolute path or URL address (including the filename extensions) of the file for the upbeats.
+ * @param config The configuration of the rhythm player.
+ *
+ * @return int
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int startRhythmPlayer(const char* sound1, const char* sound2, const AgoraRhythmPlayerConfig& config) = 0;
+
+ /** Disables the rhythm player.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int stopRhythmPlayer() = 0;
+
+ /** Configures the rhythm player.
+ *
+ * @param config The configuration of the rhythm player.
+ *
+ * @return int
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int configRhythmPlayer(const AgoraRhythmPlayerConfig& config) = 0;
+
+ /**
+ * Takes a snapshot of a video stream.
+ *
+ * This method takes a snapshot of a video stream from the specified user, generates a JPG
+ * image, and saves it to the specified path.
+ *
+ * This method is asynchronous, and the SDK has not taken the snapshot when the method call
+ * returns. After a successful method call, the SDK triggers the `onSnapshotTaken` callback
+ * to report whether the snapshot is successfully taken, as well as the details for that
+ * snapshot.
+ *
+ * @note
+ * - Call this method after joining a channel.
+ * - This method takes a snapshot of the published video stream specified in `ChannelMediaOptions`.
+ * - If the user's video has been preprocessed, for example, watermarked or beautified, the resulting
+ * snapshot includes the pre-processing effect.
+ *
+ * @param uid The user ID. Set uid as 0 if you want to take a snapshot of the local user's video.
+ * @param filePath The local path (including filename extensions) of the snapshot. For example:
+ * - Windows: `C:\Users\\AppData\Local\Agora\\example.jpg`
+ * - iOS: `/App Sandbox/Library/Caches/example.jpg`
+ * - macOS: `~/Library/Logs/example.jpg`
+ * - Android: `/storage/emulated/0/Android/data//files/example.jpg`
+ *
+ * Ensure that the path you specify exists and is writable.
+ * @return
+ * - 0 : Success.
+ * - < 0 : Failure.
+ */
+ virtual int takeSnapshot(uid_t uid, const char* filePath) = 0;
+
+ /** Enables content inspection.
+ @param enabled Whether to enable content inspection:
+ - `true`: Yes.
+ - `false`: No.
+ @param config The configuration for the content inspection.
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int enableContentInspect(bool enabled, const media::ContentInspectConfig &config) = 0;
+ /**
+ * Adjusts the publish volume of the custom audio track by track ID.
+ * @param trackId The custom audio track ID.
+ * @param volume The volume. The value range is [0,100]:
+ * - 0: Mute.
+ * - 100: The original volume.
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int adjustCustomAudioPublishVolume(track_id_t trackId, int volume) = 0;
+
+ /**
+ * Adjusts the playout volume of the custom audio track by track ID.
+ * @param trackId The custom audio track ID.
+ * @param volume The volume. The value range is [0,100]:
+ * - 0: Mute.
+ * - 100: The original volume.
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int adjustCustomAudioPlayoutVolume(track_id_t trackId, int volume) = 0;
+
+ /** Sets the Agora cloud proxy service.
+ *
+ * @since v3.3.0
+ *
+ * When the user's firewall restricts the IP address and port, refer to *Use Cloud Proxy* to add the specific
+ * IP addresses and ports to the firewall allowlist; then, call this method to enable the cloud proxy and set
+ * the `proxyType` parameter as `UDP_PROXY(1)`, which is the cloud proxy for the UDP protocol.
+ *
+ * After a successful cloud proxy connection, the SDK triggers
+ * the \ref IRtcEngineEventHandler::onConnectionStateChanged "onConnectionStateChanged" (CONNECTION_STATE_CONNECTING, CONNECTION_CHANGED_SETTING_PROXY_SERVER) callback.
+ *
+ * To disable the cloud proxy that has been set, call `setCloudProxy(NONE_PROXY)`. To change the cloud proxy type that has been set,
+ * call `setCloudProxy(NONE_PROXY)` first, and then call `setCloudProxy`, and pass the value that you expect in `proxyType`.
+ *
+ * @note
+ * - Agora recommends that you call this method before joining the channel or after leaving the channel.
+ * - For the SDK v3.3.x, the services for pushing streams to CDN and co-hosting across channels are not available
+ * when you use the cloud proxy for the UDP protocol. For the SDK v3.4.0 and later, the services for pushing streams
+ * to CDN and co-hosting across channels are not available when the user is in a network environment with a firewall
+ * and uses the cloud proxy for the UDP protocol.
+ *
+ * @param proxyType The cloud proxy type, see #CLOUD_PROXY_TYPE. This parameter is required, and the SDK reports an error if you do not pass in a value.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ * - `-2(ERR_INVALID_ARGUMENT)`: The parameter is invalid.
+ * - `-7(ERR_NOT_INITIALIZED)`: The SDK is not initialized.
+ */
+ virtual int setCloudProxy(CLOUD_PROXY_TYPE proxyType) = 0;
+ /** Sets the local access point addresses in local proxy mode. Use this method before joining a channel.
+
+ @param config The LocalAccessPointConfiguration class. See the definition of LocalAccessPointConfiguration for details.
+
+ @return
+ - 0: Success
+ - < 0: Failure
+ */
+ virtual int setLocalAccessPoint(const LocalAccessPointConfiguration& config) = 0;
+
+ /** Sets advanced audio options.
+ @param options The AdvancedAudioOptions class. See the definition of AdvancedAudioOptions for details.
+
+ @return
+ - 0: Success
+ - < 0: Failure
+ */
+ virtual int setAdvancedAudioOptions(AdvancedAudioOptions& options, int sourceType = 0) = 0;
+
+ /** Binds the local user and a remote user as an audio and video sync group. The remote user is defined by cid and uid.
+ * Note that the local user must be a video stream sender. On the receiver side, media streams from the same sync group are time-synced.
+ *
+ * @param channelId The channel ID.
+ * @param uid The user ID of the remote user to be bound with (local user).
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setAVSyncSource(const char* channelId, uid_t uid) = 0;
+
+ /**
+ * @brief Enables or disables the video image source to replace the currently published video source, or resumes it.
+ *
+ * @param enable `true` to enable, `false` to disable.
+ * @param options The options for the image track.
+ */
+ virtual int enableVideoImageSource(bool enable, const ImageTrackOptions& options) = 0;
+
+ /**
+ * Gets the monotonic time in ms, which can be used as the capture time.
+ * A typical scenario is as follows:
+ *
+ * ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+ * | // custom audio/video base capture time, e.g. the first audio/video capture time.              |
+ * | int64_t custom_capture_time_base;                                                              |
+ * |                                                                                                |
+ * | int64_t agora_monotonic_time = getCurrentMonotonicTimeInMs();                                  |
+ * |                                                                                                |
+ * | // offset is fixed once calculated in the beginning.
|
+ * | const int64_t offset = agora_monotonic_time - custom_capture_time_base;                        |
+ * |                                                                                                |
+ * | // realtime_custom_audio/video_capture_time is the origin capture time that customer provided. |
+ * | // actual_audio/video_capture_time is the actual capture time transferred to the SDK.          |
+ * | int64_t actual_audio_capture_time = realtime_custom_audio_capture_time + offset;               |
+ * | int64_t actual_video_capture_time = realtime_custom_video_capture_time + offset;               |
+ * ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
+ *
+ * @return
+ * - >= 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int64_t getCurrentMonotonicTimeInMs() = 0;
+
+ /**
+ * Turns WIFI acceleration on or off.
+ *
+ * @note
+ * - This method is called before and after joining a channel.
+ * - Users check the WIFI router app for information about acceleration. Therefore, if this interface is invoked, the caller accepts that the caller's name will be displayed to the user in the WIFI router application on behalf of the caller.
+ *
+ * @param enabled
+ * - true: Turn WIFI acceleration on.
+ * - false: Turn WIFI acceleration off.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int enableWirelessAccelerate(bool enabled) = 0;
+
+ /**
+ * Gets the network type value.
+ *
+ * @return
+ * - 0: DISCONNECTED.
+ * - 1: LAN.
+ * - 2: WIFI.
+ * - 3: MOBILE_2G.
+ * - 4: MOBILE_3G.
+ * - 5: MOBILE_4G.
+ * - 6: MOBILE_5G.
+ * - -1: UNKNOWN.
+ */
+ virtual int getNetworkType() = 0;
+
+ /** Provides the technical preview functionalities or special customizations by configuring the SDK with JSON options.
+
+ @param parameters Pointer to the set parameters in a JSON string.
+
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int setParameters(const char* parameters) = 0;
+
+ /**
+ @brief Starts tracing media rendering events.
+ @since v4.1.1
+ @discussion
+ - SDK will trace media rendering events when this API is called.
+ * - The tracing result can be obtained through callback `IRtcEngineEventHandler::onVideoRenderingTracingResult`.
+ @note
+ - By default, SDK will trace media rendering events when `IRtcEngine::joinChannel` is called.
+ - The start point of event tracing will be reset after leaving the channel.
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ - -7(ERR_NOT_INITIALIZED): The SDK is not initialized. Initialize the `IRtcEngine` instance before calling this method.
+ */
+ virtual int startMediaRenderingTracing() = 0;
+
+ /**
+ @brief Enables instant media rendering.
+ @since v4.1.1
+ @discussion
+ - This method enables the SDK to render video or play out audio faster.
+ @note
+ - Once this mode is enabled, destroy the RTC engine to disable it.
+ - Enabling this mode sacrifices some of the experience.
+ @return
+ - 0: Success.
+ - < 0: Failure.
+ - -7(ERR_NOT_INITIALIZED): The SDK is not initialized. Initialize the `IRtcEngine` instance before calling this method.
+ */
+ virtual int enableInstantMediaRendering() = 0;
+
+ /**
+ * Returns the current NTP (Unix timestamp) time in milliseconds.
+ */
+ virtual uint64_t getNtpWallTimeInMs() = 0;
+
+ /**
+ * @brief Whether the target feature is available for the device.
+ * @since v4.3.0
+ * @param type The feature type. See FeatureType.
+ * @return
+ * - true: available.
+ * - false: not available.
+ */
+ virtual bool isFeatureAvailableOnDevice(FeatureType type) = 0;
+};
+
+// The following types are either deprecated or not implemented yet.
+enum QUALITY_REPORT_FORMAT_TYPE {
+  /** 0: The quality report in JSON format.
+   */
+  QUALITY_REPORT_JSON = 0,
+  /** 1: The quality report in HTML format.
+   */
+  QUALITY_REPORT_HTML = 1,
+};
+
+/** Media device states. */
+enum MEDIA_DEVICE_STATE_TYPE {
+  /** 0: The device is ready for use.
+   */
+  MEDIA_DEVICE_STATE_IDLE = 0,
+  /** 1: The device is active.
+   */
+  MEDIA_DEVICE_STATE_ACTIVE = 1,
+  /** 2: The device is disabled.
+   */
+  MEDIA_DEVICE_STATE_DISABLED = 2,
+  /** 4: The device is not present.
+ */ + MEDIA_DEVICE_STATE_NOT_PRESENT = 4, + /** 8: The device is unplugged. + */ + MEDIA_DEVICE_STATE_UNPLUGGED = 8 +}; + +enum VIDEO_PROFILE_TYPE { + /** 0: 160 x 120 @ 15 fps */ // res fps + VIDEO_PROFILE_LANDSCAPE_120P = 0, // 160x120 15 + /** 2: 120 x 120 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_120P_3 = 2, // 120x120 15 + /** 10: 320 x 180 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_180P = 10, // 320x180 15 + /** 12: 180 x 180 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_180P_3 = 12, // 180x180 15 + /** 13: 240 x 180 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_180P_4 = 13, // 240x180 15 + /** 20: 320 x 240 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_240P = 20, // 320x240 15 + /** 22: 240 x 240 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_240P_3 = 22, // 240x240 15 + /** 23: 424 x 240 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_240P_4 = 23, // 424x240 15 + /** 30: 640 x 360 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_360P = 30, // 640x360 15 + /** 32: 360 x 360 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_360P_3 = 32, // 360x360 15 + /** 33: 640 x 360 @ 30 fps */ + VIDEO_PROFILE_LANDSCAPE_360P_4 = 33, // 640x360 30 + /** 35: 360 x 360 @ 30 fps */ + VIDEO_PROFILE_LANDSCAPE_360P_6 = 35, // 360x360 30 + /** 36: 480 x 360 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_360P_7 = 36, // 480x360 15 + /** 37: 480 x 360 @ 30 fps */ + VIDEO_PROFILE_LANDSCAPE_360P_8 = 37, // 480x360 30 + /** 38: 640 x 360 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_360P_9 = 38, // 640x360 15 + /** 39: 640 x 360 @ 24 fps */ + VIDEO_PROFILE_LANDSCAPE_360P_10 = 39, // 640x360 24 + /** 100: 640 x 360 @ 24 fps */ + VIDEO_PROFILE_LANDSCAPE_360P_11 = 100, // 640x360 24 + /** 40: 640 x 480 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_480P = 40, // 640x480 15 + /** 42: 480 x 480 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_480P_3 = 42, // 480x480 15 + /** 43: 640 x 480 @ 30 fps */ + VIDEO_PROFILE_LANDSCAPE_480P_4 = 43, // 640x480 30 + /** 45: 480 x 480 @ 30 fps */ + VIDEO_PROFILE_LANDSCAPE_480P_6 = 45, // 480x480 30 + /** 47: 848 x 480 @ 15 fps */ + 
VIDEO_PROFILE_LANDSCAPE_480P_8 = 47, // 848x480 15 + /** 48: 848 x 480 @ 30 fps */ + VIDEO_PROFILE_LANDSCAPE_480P_9 = 48, // 848x480 30 + /** 49: 640 x 480 @ 10 fps */ + VIDEO_PROFILE_LANDSCAPE_480P_10 = 49, // 640x480 10 + /** 50: 1280 x 720 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_720P = 50, // 1280x720 15 + /** 52: 1280 x 720 @ 30 fps */ + VIDEO_PROFILE_LANDSCAPE_720P_3 = 52, // 1280x720 30 + /** 54: 960 x 720 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_720P_5 = 54, // 960x720 15 + /** 55: 960 x 720 @ 30 fps */ + VIDEO_PROFILE_LANDSCAPE_720P_6 = 55, // 960x720 30 + /** 60: 1920 x 1080 @ 15 fps */ + VIDEO_PROFILE_LANDSCAPE_1080P = 60, // 1920x1080 15 + /** 62: 1920 x 1080 @ 30 fps */ + VIDEO_PROFILE_LANDSCAPE_1080P_3 = 62, // 1920x1080 30 + /** 64: 1920 x 1080 @ 60 fps */ + VIDEO_PROFILE_LANDSCAPE_1080P_5 = 64, // 1920x1080 60 + /** 66: 2560 x 1440 @ 30 fps */ + VIDEO_PROFILE_LANDSCAPE_1440P = 66, // 2560x1440 30 + /** 67: 2560 x 1440 @ 60 fps */ + VIDEO_PROFILE_LANDSCAPE_1440P_2 = 67, // 2560x1440 60 + /** 70: 3840 x 2160 @ 30 fps */ + VIDEO_PROFILE_LANDSCAPE_4K = 70, // 3840x2160 30 + /** 72: 3840 x 2160 @ 60 fps */ + VIDEO_PROFILE_LANDSCAPE_4K_3 = 72, // 3840x2160 60 + /** 1000: 120 x 160 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_120P = 1000, // 120x160 15 + /** 1002: 120 x 120 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_120P_3 = 1002, // 120x120 15 + /** 1010: 180 x 320 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_180P = 1010, // 180x320 15 + /** 1012: 180 x 180 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_180P_3 = 1012, // 180x180 15 + /** 1013: 180 x 240 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_180P_4 = 1013, // 180x240 15 + /** 1020: 240 x 320 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_240P = 1020, // 240x320 15 + /** 1022: 240 x 240 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_240P_3 = 1022, // 240x240 15 + /** 1023: 240 x 424 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_240P_4 = 1023, // 240x424 15 + /** 1030: 360 x 640 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_360P = 1030, // 360x640 15 + /** 1032: 360 x 360 @ 15 fps */ 
+ VIDEO_PROFILE_PORTRAIT_360P_3 = 1032, // 360x360 15 + /** 1033: 360 x 640 @ 30 fps */ + VIDEO_PROFILE_PORTRAIT_360P_4 = 1033, // 360x640 30 + /** 1035: 360 x 360 @ 30 fps */ + VIDEO_PROFILE_PORTRAIT_360P_6 = 1035, // 360x360 30 + /** 1036: 360 x 480 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_360P_7 = 1036, // 360x480 15 + /** 1037: 360 x 480 @ 30 fps */ + VIDEO_PROFILE_PORTRAIT_360P_8 = 1037, // 360x480 30 + /** 1038: 360 x 640 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_360P_9 = 1038, // 360x640 15 + /** 1039: 360 x 640 @ 24 fps */ + VIDEO_PROFILE_PORTRAIT_360P_10 = 1039, // 360x640 24 + /** 1100: 360 x 640 @ 24 fps */ + VIDEO_PROFILE_PORTRAIT_360P_11 = 1100, // 360x640 24 + /** 1040: 480 x 640 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_480P = 1040, // 480x640 15 + /** 1042: 480 x 480 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_480P_3 = 1042, // 480x480 15 + /** 1043: 480 x 640 @ 30 fps */ + VIDEO_PROFILE_PORTRAIT_480P_4 = 1043, // 480x640 30 + /** 1045: 480 x 480 @ 30 fps */ + VIDEO_PROFILE_PORTRAIT_480P_6 = 1045, // 480x480 30 + /** 1047: 480 x 848 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_480P_8 = 1047, // 480x848 15 + /** 1048: 480 x 848 @ 30 fps */ + VIDEO_PROFILE_PORTRAIT_480P_9 = 1048, // 480x848 30 + /** 1049: 480 x 640 @ 10 fps */ + VIDEO_PROFILE_PORTRAIT_480P_10 = 1049, // 480x640 10 + /** 1050: 720 x 1280 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_720P = 1050, // 720x1280 15 + /** 1052: 720 x 1280 @ 30 fps */ + VIDEO_PROFILE_PORTRAIT_720P_3 = 1052, // 720x1280 30 + /** 1054: 720 x 960 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_720P_5 = 1054, // 720x960 15 + /** 1055: 720 x 960 @ 30 fps */ + VIDEO_PROFILE_PORTRAIT_720P_6 = 1055, // 720x960 30 + /** 1060: 1080 x 1920 @ 15 fps */ + VIDEO_PROFILE_PORTRAIT_1080P = 1060, // 1080x1920 15 + /** 1062: 1080 x 1920 @ 30 fps */ + VIDEO_PROFILE_PORTRAIT_1080P_3 = 1062, // 1080x1920 30 + /** 1064: 1080 x 1920 @ 60 fps */ + VIDEO_PROFILE_PORTRAIT_1080P_5 = 1064, // 1080x1920 60 + /** 1066: 1440 x 2560 @ 30 fps */ + VIDEO_PROFILE_PORTRAIT_1440P = 1066, // 
1440x2560 30
+ /** 1067: 1440 x 2560 @ 60 fps */
+ VIDEO_PROFILE_PORTRAIT_1440P_2 = 1067, // 1440x2560 60
+ /** 1070: 2160 x 3840 @ 30 fps */
+ VIDEO_PROFILE_PORTRAIT_4K = 1070, // 2160x3840 30
+ /** 1072: 2160 x 3840 @ 60 fps */
+ VIDEO_PROFILE_PORTRAIT_4K_3 = 1072, // 2160x3840 60
+ /** Default 640 x 360 @ 15 fps */
+ VIDEO_PROFILE_DEFAULT = VIDEO_PROFILE_LANDSCAPE_360P,
+};
+
+class AAudioDeviceManager : public agora::util::AutoPtr<IAudioDeviceManager> {
+ public:
+  AAudioDeviceManager(IRtcEngine* engine) {
+    queryInterface(engine, AGORA_IID_AUDIO_DEVICE_MANAGER);
+  }
+};
+
+class AVideoDeviceManager : public agora::util::AutoPtr<IVideoDeviceManager> {
+ public:
+  AVideoDeviceManager(IRtcEngine* engine) {
+    queryInterface(engine, AGORA_IID_VIDEO_DEVICE_MANAGER);
+  }
+};
+
+} // namespace rtc
+} // namespace agora
+
+/** Gets the SDK version number.
+
+@param build Build number of the Agora SDK.
+* @return String of the SDK version.
+*/
+#define getAgoraRtcEngineVersion getAgoraSdkVersion
+
+////////////////////////////////////////////////////////
+/** \addtogroup createAgoraRtcEngine
+ @{
+ */
+////////////////////////////////////////////////////////
+
+/** Creates the RTC engine object and returns the pointer.
+
+* @return Pointer to the RTC engine object.
+*/
+AGORA_API agora::rtc::IRtcEngine* AGORA_CALL createAgoraRtcEngine();
+
+////////////////////////////////////////////////////////
+/** @} */
+////////////////////////////////////////////////////////
+
+/** Gets the description of the error code.
+
+ @param err Error Code.
+* @return Description of the Error Code: agora::ERROR_CODE_TYPE +*/ +#define getAgoraRtcEngineErrorDescription getAgoraSdkErrorDescription +#define setAgoraRtcEngineExternalSymbolLoader setAgoraSdkExternalSymbolLoader diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraRtcEngineEx.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraRtcEngineEx.h new file mode 100644 index 000000000..e9826d78f --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraRtcEngineEx.h @@ -0,0 +1,1961 @@ +// +// Agora Media SDK +// +// Created by Sting Feng in 2015-05. +// Updated by Tommy Miao in 2020-11. +// Copyright (c) 2015 Agora IO. All rights reserved. +// +#pragma once + +#include "IAgoraRtcEngine.h" + +namespace agora { +namespace rtc { + +// OPTIONAL_ENUM_CLASS RTC_EVENT; + +/** + * Rtc Connection. + */ +struct RtcConnection { + /** + * The unique channel name for the AgoraRTC session in the string format. The string + * length must be less than 64 bytes. Supported character scopes are: + * - All lowercase English letters: a to z. + * - All uppercase English letters: A to Z. + * - All numeric characters: 0 to 9. + * - The space character. + * - Punctuation characters and other symbols, including: "!", "#", "$", "%", "&", "(", ")", "+", "-", + * ":", ";", "<", "=", ".", ">", "?", "@", "[", "]", "^", "_", " {", "}", "|", "~", ",". + */ + const char* channelId; + /** + * User ID: A 32-bit unsigned integer ranging from 1 to (2^32-1). It must be unique. 
+ */ + uid_t localUid; + + RtcConnection() : channelId(NULL), localUid(0) {} + RtcConnection(const char* channel_id, uid_t local_uid) + : channelId(channel_id), localUid(local_uid) {} +}; + +class IRtcEngineEventHandlerEx : public IRtcEngineEventHandler { + public: + using IRtcEngineEventHandler::eventHandlerType; + using IRtcEngineEventHandler::onJoinChannelSuccess; + using IRtcEngineEventHandler::onRejoinChannelSuccess; + using IRtcEngineEventHandler::onAudioQuality; + using IRtcEngineEventHandler::onAudioVolumeIndication; + using IRtcEngineEventHandler::onLeaveChannel; + using IRtcEngineEventHandler::onRtcStats; + using IRtcEngineEventHandler::onNetworkQuality; + using IRtcEngineEventHandler::onIntraRequestReceived; + using IRtcEngineEventHandler::onFirstLocalVideoFramePublished; + using IRtcEngineEventHandler::onFirstRemoteVideoDecoded; + using IRtcEngineEventHandler::onVideoSizeChanged; + using IRtcEngineEventHandler::onLocalVideoStateChanged; + using IRtcEngineEventHandler::onRemoteVideoStateChanged; + using IRtcEngineEventHandler::onFirstRemoteVideoFrame; + using IRtcEngineEventHandler::onUserJoined; + using IRtcEngineEventHandler::onUserOffline; + using IRtcEngineEventHandler::onUserMuteAudio; + using IRtcEngineEventHandler::onUserMuteVideo; + using IRtcEngineEventHandler::onUserEnableVideo; + using IRtcEngineEventHandler::onUserEnableLocalVideo; + using IRtcEngineEventHandler::onUserStateChanged; + using IRtcEngineEventHandler::onLocalAudioStats; + using IRtcEngineEventHandler::onRemoteAudioStats; + using IRtcEngineEventHandler::onLocalVideoStats; + using IRtcEngineEventHandler::onRemoteVideoStats; + using IRtcEngineEventHandler::onConnectionLost; + using IRtcEngineEventHandler::onConnectionInterrupted; + using IRtcEngineEventHandler::onConnectionBanned; + using IRtcEngineEventHandler::onStreamMessage; + using IRtcEngineEventHandler::onStreamMessageError; + using IRtcEngineEventHandler::onRequestToken; + using 
IRtcEngineEventHandler::onTokenPrivilegeWillExpire; + using IRtcEngineEventHandler::onLicenseValidationFailure; + using IRtcEngineEventHandler::onFirstLocalAudioFramePublished; + using IRtcEngineEventHandler::onFirstRemoteAudioFrame; + using IRtcEngineEventHandler::onFirstRemoteAudioDecoded; + using IRtcEngineEventHandler::onLocalAudioStateChanged; + using IRtcEngineEventHandler::onRemoteAudioStateChanged; + using IRtcEngineEventHandler::onActiveSpeaker; + using IRtcEngineEventHandler::onClientRoleChanged; + using IRtcEngineEventHandler::onClientRoleChangeFailed; + using IRtcEngineEventHandler::onRemoteAudioTransportStats; + using IRtcEngineEventHandler::onRemoteVideoTransportStats; + using IRtcEngineEventHandler::onConnectionStateChanged; + using IRtcEngineEventHandler::onWlAccMessage; + using IRtcEngineEventHandler::onWlAccStats; + using IRtcEngineEventHandler::onNetworkTypeChanged; + using IRtcEngineEventHandler::onEncryptionError; + using IRtcEngineEventHandler::onUploadLogResult; + using IRtcEngineEventHandler::onUserInfoUpdated; + using IRtcEngineEventHandler::onUserAccountUpdated; + using IRtcEngineEventHandler::onAudioSubscribeStateChanged; + using IRtcEngineEventHandler::onVideoSubscribeStateChanged; + using IRtcEngineEventHandler::onAudioPublishStateChanged; + using IRtcEngineEventHandler::onVideoPublishStateChanged; + using IRtcEngineEventHandler::onSnapshotTaken; + using IRtcEngineEventHandler::onVideoRenderingTracingResult; + using IRtcEngineEventHandler::onSetRtmFlagResult; + using IRtcEngineEventHandler::onTranscodedStreamLayoutInfo; + + virtual const char* eventHandlerType() const { return "event_handler_ex"; } + + /** + * Occurs when a user joins a channel. + * + * This callback notifies the application that a user joins a specified channel. + * + * @param connection The RtcConnection object. + * @param elapsed The time elapsed (ms) from the local user calling joinChannel until the SDK triggers this callback. 
+ */ + virtual void onJoinChannelSuccess(const RtcConnection& connection, int elapsed) { + (void)connection; + (void)elapsed; + } + + /** + * Occurs when a user rejoins the channel. + * + * When a user loses connection with the server because of network problems, the SDK automatically tries to reconnect + * and triggers this callback upon reconnection. + * + * @param connection The RtcConnection object. + * @param elapsed Time elapsed (ms) from the local user calling the joinChannel method until this callback is triggered. + */ + virtual void onRejoinChannelSuccess(const RtcConnection& connection, int elapsed) { + (void)connection; + (void)elapsed; + } + + /** Reports the statistics of the audio stream from each remote + user/broadcaster. + + @deprecated This callback is deprecated. Use onRemoteAudioStats instead. + + The SDK triggers this callback once every two seconds to report the audio + quality of each remote user/host sending an audio stream. If a channel has + multiple remote users/hosts sending audio streams, the SDK triggers this + callback as many times. + + @param connection The RtcConnection object. + @param remoteUid The user ID of the remote user sending the audio stream. + @param quality The audio quality of the user: #QUALITY_TYPE + @param delay The network delay (ms) from the sender to the receiver, including the delay caused by audio sampling pre-processing, network transmission, and network jitter buffering. + @param lost The audio packet loss rate (%) from the sender to the receiver. + */ + virtual void onAudioQuality(const RtcConnection& connection, uid_t remoteUid, int quality, unsigned short delay, unsigned short lost) __deprecated { + (void)connection; + (void)remoteUid; + (void)quality; + (void)delay; + (void)lost; + } + /** + * Reports the volume information of users. + * + * By default, this callback is disabled. You can enable it by calling `enableAudioVolumeIndication`. 
Once this + * callback is enabled and users send streams in the channel, the SDK triggers the `onAudioVolumeIndication` + * callback at the time interval set in `enableAudioVolumeIndication`. The SDK triggers two independent + * `onAudioVolumeIndication` callbacks simultaneously, which separately report the volume information of the + * local user who sends a stream and the remote users (up to three) whose instantaneous volume is the highest. + * + * @note After you enable this callback, calling muteLocalAudioStream affects the SDK's behavior as follows: + * - If the local user stops publishing the audio stream, the SDK stops triggering the local user's callback. + * - 20 seconds after a remote user whose volume is one of the three highest stops publishing the audio stream, + * the callback excludes this user's information; 20 seconds after all remote users stop publishing audio streams, + * the SDK stops triggering the callback for remote users. + * + * @param connection The RtcConnection object. + * @param speakers The volume information of the users, see AudioVolumeInfo. An empty `speakers` array in the + * callback indicates that no remote user is in the channel or sending a stream at the moment. + * @param speakerNumber The total number of speakers. + * - In the local user's callback, when the local user sends a stream, `speakerNumber` is 1. + * - In the callback for remote users, the value range of speakerNumber is [0,3]. If the number of remote users who + * send streams is greater than or equal to three, the value of `speakerNumber` is 3. + * @param totalVolume The volume of the speaker. The value ranges between 0 (lowest volume) and 255 (highest volume). + * - In the local user's callback, `totalVolume` is the volume of the local user who sends a stream. + * - In the remote users' callback, `totalVolume` is the sum of all remote users (up to three) whose instantaneous + * volume is the highest. 
If the user calls `startAudioMixing`, `totalVolume` is the volume after audio mixing. + */ + virtual void onAudioVolumeIndication(const RtcConnection& connection, const AudioVolumeInfo* speakers, + unsigned int speakerNumber, int totalVolume) { + (void)connection; + (void)speakers; + (void)speakerNumber; + (void)totalVolume; + } + + /** + * Occurs when a user leaves a channel. + * + * This callback notifies the app that the user leaves the channel by calling `leaveChannel`. From this callback, + * the app can get information such as the call duration and quality statistics. + * + * @param connection The RtcConnection object. + * @param stats The statistics on the call: RtcStats. + */ + virtual void onLeaveChannel(const RtcConnection& connection, const RtcStats& stats) { + (void)connection; + (void)stats; + } + + /** + * Reports the statistics of the current call. + * + * The SDK triggers this callback once every two seconds after the user joins the channel. + * + * @param connection The RtcConnection object. + * @param stats The statistics of the current call: RtcStats. + */ + virtual void onRtcStats(const RtcConnection& connection, const RtcStats& stats) { + (void)connection; + (void)stats; + } + + /** + * Reports the last mile network quality of each user in the channel. + * + * This callback reports the last mile network conditions of each user in the channel. Last mile refers to the + * connection between the local device and Agora's edge server. + * + * The SDK triggers this callback once every two seconds. If a channel includes multiple users, the SDK triggers + * this callback as many times. + * + * @note `txQuality` is UNKNOWN when the user is not sending a stream; `rxQuality` is UNKNOWN when the user is not + * receiving a stream. + * + * @param connection The RtcConnection object. + * @param remoteUid The user ID. The network quality of the user with this user ID is reported. 
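The `onAudioVolumeIndication` contract above delivers a `speakers` array (up to three remote users, or the local user alone) with per-user volumes in [0, 255]. A self-contained sketch of picking the loudest reported speaker from one delivery; `VolumeInfo` is a stand-in for the fields of the SDK's `AudioVolumeInfo`, not the real type:

```cpp
#include <cstdint>

// Stand-in for the SDK's AudioVolumeInfo (only the fields used here).
struct VolumeInfo {
  uint32_t uid;         // 0 in the local user's callback
  unsigned int volume;  // 0 (silent) to 255 (loudest)
};

// Picks the loudest speaker reported in one onAudioVolumeIndication delivery.
// Returns 0 when the speakers array is empty (no one is sending a stream).
static uint32_t loudestSpeaker(const VolumeInfo* speakers, unsigned int speakerNumber) {
  uint32_t best = 0;
  unsigned int bestVolume = 0;
  for (unsigned int i = 0; i < speakerNumber; ++i) {
    if (speakers[i].volume > bestVolume) {
      bestVolume = speakers[i].volume;
      best = speakers[i].uid;
    }
  }
  return best;
}
```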
+ * @param txQuality Uplink network quality rating of the user in terms of the transmission bit rate, packet loss rate, + * average RTT (Round-Trip Time) and jitter of the uplink network. This parameter is a quality rating helping you + * understand how well the current uplink network conditions can support the selected video encoder configuration. + * For example, a 1000 Kbps uplink network may be adequate for video frames with a resolution of 640 × 480 and a frame + * rate of 15 fps in the LIVE_BROADCASTING profile, but may be inadequate for resolutions higher than 1280 × 720. + * See #QUALITY_TYPE. + * @param rxQuality Downlink network quality rating of the user in terms of packet loss rate, average RTT, and jitter + * of the downlink network. See #QUALITY_TYPE. + */ + virtual void onNetworkQuality(const RtcConnection& connection, uid_t remoteUid, int txQuality, int rxQuality) { + (void)connection; + (void)remoteUid; + (void)txQuality; + (void)rxQuality; + } + + /** + * Occurs when an intra request from a remote user is received. + * + * This callback is triggered when a remote user requests a key frame. + * + * @param connection The RtcConnection object. + */ + virtual void onIntraRequestReceived(const RtcConnection& connection) { + (void)connection; + } + + /** Occurs when the first local video frame is published. + * The SDK triggers this callback under one of the following circumstances: + * - The local client enables the video module and calls `joinChannel` successfully. + * - The local client calls `muteLocalVideoStream(true)` and `muteLocalVideoStream(false)` in sequence. + * - The local client calls `disableVideo` and `enableVideo` in sequence. + * - The local client calls `pushVideoFrame` to successfully push the video frame to the SDK. + * + * @param connection The RtcConnection object. + * @param elapsed The time elapsed (ms) from the local user calling `joinChannel` until the SDK triggers + * this callback. 
+ */ + virtual void onFirstLocalVideoFramePublished(const RtcConnection& connection, int elapsed) { + (void)connection; + (void)elapsed; + } + + /** Occurs when the first remote video frame is received and decoded. + + The SDK triggers this callback under one of the following circumstances: + - The remote user joins the channel and sends the video stream. + - The remote user stops sending the video stream and re-sends it after 15 seconds. Reasons for such an interruption include: + - The remote user leaves the channel. + - The remote user drops offline. + - The remote user calls `muteLocalVideoStream` to stop sending the video stream. + - The remote user calls `disableVideo` to disable video. + + @param connection The RtcConnection object. + @param remoteUid The user ID of the remote user sending the video stream. + @param width The width (pixels) of the video stream. + @param height The height (pixels) of the video stream. + @param elapsed The time elapsed (ms) from the local user calling `joinChannel` + until the SDK triggers this callback. + */ + virtual void onFirstRemoteVideoDecoded(const RtcConnection& connection, uid_t remoteUid, int width, int height, int elapsed) { + (void)connection; + (void)remoteUid; + (void)width; + (void)height; + (void)elapsed; + } + + /** + * Occurs when the local or remote video size or rotation has changed. + * + * @param connection The RtcConnection object. + * @param sourceType The video source type: #VIDEO_SOURCE_TYPE. + * @param uid The user ID. 0 indicates the local user. + * @param width The new width (pixels) of the video. + * @param height The new height (pixels) of the video. + * @param rotation The rotation information of the video. 
+ */ + virtual void onVideoSizeChanged(const RtcConnection& connection, VIDEO_SOURCE_TYPE sourceType, uid_t uid, int width, int height, int rotation) { + (void)connection; + (void)sourceType; + (void)uid; + (void)width; + (void)height; + (void)rotation; + } + + /** Occurs when the local video stream state changes. + * + * When the state of the local video stream changes (including the state of the video capture and + * encoding), the SDK triggers this callback to report the current state. This callback indicates + * the state of the local video stream, including camera capturing and video encoding, and allows + * you to troubleshoot issues when exceptions occur. + * + * The SDK triggers the onLocalVideoStateChanged callback with the state code of `LOCAL_VIDEO_STREAM_STATE_FAILED` + * and error code of `LOCAL_VIDEO_STREAM_REASON_CAPTURE_FAILURE` in the following situations: + * - The app switches to the background, and the system gets the camera resource. + * - The camera starts normally, but does not output video for four consecutive seconds. + * + * When the camera outputs the captured video frames, if the video frames are the same for 15 + * consecutive frames, the SDK triggers the `onLocalVideoStateChanged` callback with the state code + * of `LOCAL_VIDEO_STREAM_STATE_CAPTURING` and error code of `LOCAL_VIDEO_STREAM_REASON_CAPTURE_FAILURE`. + * Note that the video frame duplication detection is only available for video frames with a resolution + * greater than 200 × 200, a frame rate greater than or equal to 10 fps, and a bitrate less than 20 Kbps. + * + * @note For some device models, the SDK does not trigger this callback when the state of the local + * video changes while the local video capturing device is in use, so you have to make your own + * timeout judgment. + * + * @param connection The RtcConnection object. + * @param state The state of the local video. See #LOCAL_VIDEO_STREAM_STATE. + * @param reason The detailed error information. 
See #LOCAL_VIDEO_STREAM_REASON. + */ + virtual void onLocalVideoStateChanged(const RtcConnection& connection, + LOCAL_VIDEO_STREAM_STATE state, + LOCAL_VIDEO_STREAM_REASON reason) { + (void)connection; + (void)state; + (void)reason; + } + + /** + * Occurs when the remote video state changes. + * + * @note This callback does not work properly when the number of users (in the voice/video call + * channel) or hosts (in the live streaming channel) in the channel exceeds 17. + * + * @param connection The RtcConnection object. + * @param remoteUid The ID of the user whose video state has changed. + * @param state The remote video state: #REMOTE_VIDEO_STATE. + * @param reason The reason of the remote video state change: #REMOTE_VIDEO_STATE_REASON. + * @param elapsed The time elapsed (ms) from the local client calling `joinChannel` until this callback is triggered. + */ + virtual void onRemoteVideoStateChanged(const RtcConnection& connection, uid_t remoteUid, REMOTE_VIDEO_STATE state, REMOTE_VIDEO_STATE_REASON reason, int elapsed) { + (void)connection; + (void)remoteUid; + (void)state; + (void)reason; + (void)elapsed; + } + + /** Occurs when the renderer receives the first frame of the remote video. + * + * @param connection The RtcConnection object. + * @param remoteUid The user ID of the remote user sending the video stream. + * @param width The width (px) of the video frame. + * @param height The height (px) of the video stream. + * @param elapsed The time elapsed (ms) from the local user calling `joinChannel` until the SDK triggers this callback. + */ + virtual void onFirstRemoteVideoFrame(const RtcConnection& connection, uid_t remoteUid, int width, int height, int elapsed) { + (void)connection; + (void)remoteUid; + (void)width; + (void)height; + (void)elapsed; + } + + /** + * Occurs when a remote user or broadcaster joins the channel. + * + * - In the COMMUNICATION channel profile, this callback indicates that a remote user joins the channel. 
+ * The SDK also triggers this callback to report the existing users in the channel when a user joins the + * channel. + * In the LIVE_BROADCASTING channel profile, this callback indicates that a host joins the channel. The + * SDK also triggers this callback to report the existing hosts in the channel when a host joins the + * channel. Agora recommends limiting the number of hosts to 17. + * + * The SDK triggers this callback under one of the following circumstances: + * - A remote user/host joins the channel by calling the `joinChannel` method. + * - A remote user switches the user role to the host after joining the channel. + * - A remote user/host rejoins the channel after a network interruption. + * + * @param connection The RtcConnection object. + * @param remoteUid The ID of the remote user or broadcaster joining the channel. + * @param elapsed The time elapsed (ms) from the local user calling `joinChannel` or `setClientRole` + * until this callback is triggered. + */ + virtual void onUserJoined(const RtcConnection& connection, uid_t remoteUid, int elapsed) { + (void)connection; + (void)remoteUid; + (void)elapsed; + } + + /** + * Occurs when a remote user or broadcaster goes offline. + * + * There are two reasons for a user to go offline: + * - Leave the channel: When the user leaves the channel, the user sends a goodbye message. When this + * message is received, the SDK determines that the user leaves the channel. + * - Drop offline: When no data packet of the user is received for a certain period of time, the SDK assumes + * that the user drops offline. A poor network connection may lead to false detection, so we recommend using + * the RTM SDK for reliable offline detection. + * - The user switches the user role from a broadcaster to an audience. + * + * @param connection The RtcConnection object. + * @param remoteUid The ID of the remote user or broadcaster who leaves the channel or drops offline. 
+ * @param reason The reason why the remote user goes offline: #USER_OFFLINE_REASON_TYPE. + */ + virtual void onUserOffline(const RtcConnection& connection, uid_t remoteUid, USER_OFFLINE_REASON_TYPE reason) { + (void)connection; + (void)remoteUid; + (void)reason; + } + + /** Occurs when a remote user's audio stream playback pauses/resumes. + * The SDK triggers this callback when the remote user stops or resumes sending the audio stream by + * calling the `muteLocalAudioStream` method. + * @note This callback can be inaccurate when the number of users (in the `COMMUNICATION` profile) or hosts (in the `LIVE_BROADCASTING` profile) in the channel exceeds 17. + * + * @param connection The RtcConnection object. + * @param remoteUid The user ID. + * @param muted Whether the remote user's audio stream is muted/unmuted: + * - true: Muted. + * - false: Unmuted. + */ + virtual void onUserMuteAudio(const RtcConnection& connection, uid_t remoteUid, bool muted) __deprecated { + (void)connection; + (void)remoteUid; + (void)muted; + } + + /** Occurs when a remote user pauses or resumes sending the video stream. + * + * When a remote user calls `muteLocalVideoStream` to stop or resume publishing the video stream, the + * SDK triggers this callback to report the state of the remote user's publishing stream to the local + * user. + * + * @note This callback can be inaccurate when the number of users or broadcasters in a + * channel exceeds 20. + * + * @param connection The RtcConnection object. + * @param remoteUid ID of the remote user. + * @param muted Whether the remote user stops publishing the video stream: + * - true: The remote user has paused sending the video stream. + * - false: The remote user has resumed sending the video stream. + */ + virtual void onUserMuteVideo(const RtcConnection& connection, uid_t remoteUid, bool muted) { + (void)connection; + (void)remoteUid; + (void)muted; + } + + /** Occurs when a remote user enables or disables the video module. 
+ + Once the video function is disabled, the users cannot see any video. + + The SDK triggers this callback when a remote user enables or disables the video module by calling the + `enableVideo` or `disableVideo` method. + + @param connection The RtcConnection object. + @param remoteUid The ID of the remote user. + @param enabled Whether the video of the remote user is enabled: + - true: The remote user has enabled video. + - false: The remote user has disabled video. + */ + virtual void onUserEnableVideo(const RtcConnection& connection, uid_t remoteUid, bool enabled) { + (void)connection; + (void)remoteUid; + (void)enabled; + } + + /** Occurs when a remote user enables or disables local video capturing. + + The SDK triggers this callback when the remote user resumes or stops capturing the video stream by + calling the `enableLocalVideo` method. + + @param connection The RtcConnection object. + @param remoteUid The ID of the remote user. + @param enabled Whether the specified remote user enables/disables local video: + - `true`: The remote user has enabled local video capturing. + - `false`: The remote user has disabled local video capturing. + */ + virtual void onUserEnableLocalVideo(const RtcConnection& connection, uid_t remoteUid, bool enabled) __deprecated { + (void)connection; + (void)remoteUid; + (void)enabled; + } + + /** + * Occurs when the remote user state is updated. + * + * @param connection The RtcConnection object. + * @param remoteUid The uid of the remote user. + * @param state The remote user state: #REMOTE_USER_STATE. + */ + virtual void onUserStateChanged(const RtcConnection& connection, uid_t remoteUid, uint32_t state) { + (void)connection; + (void)remoteUid; + (void)state; + } + + /** Reports the statistics of the local audio stream. + * + * The SDK triggers this callback once every two seconds. + * + * @param connection The RtcConnection object. + * @param stats The statistics of the local audio stream. + * See LocalAudioStats. 
+ */ + virtual void onLocalAudioStats(const RtcConnection& connection, const LocalAudioStats& stats) { + (void)connection; + (void)stats; + } + + /** Reports the statistics of the audio stream from each remote user/host. + + The SDK triggers this callback once every two seconds for each remote user who is sending audio + streams. If a channel includes multiple remote users, the SDK triggers this callback as many times. + @param connection The RtcConnection object. + @param stats Statistics of the received remote audio streams. See RemoteAudioStats. + */ + virtual void onRemoteAudioStats(const RtcConnection& connection, const RemoteAudioStats& stats) { + (void)connection; + (void)stats; + } + + /** Reports the statistics of the local video stream. + * + * The SDK triggers this callback once every two seconds for each + * user/host. If there are multiple users/hosts in the channel, the SDK + * triggers this callback as many times. + * + * @note If you have called the `enableDualStreamMode` + * method, this callback reports the statistics of the high-video + * stream (high bitrate, and high-resolution video stream). + * + * @param connection The RtcConnection object. + * @param stats Statistics of the local video stream. See LocalVideoStats. + */ + virtual void onLocalVideoStats(const RtcConnection& connection, const LocalVideoStats& stats) { + (void)connection; + (void)stats; + } + + /** Reports the statistics of the video stream from each remote user/host. + * + * The SDK triggers this callback once every two seconds for each remote user. If a channel has + * multiple users/hosts sending video streams, the SDK triggers this callback as many times. + * + * @param connection The RtcConnection object. + * @param stats Statistics of the remote video stream. See + * RemoteVideoStats. 
+ */ + virtual void onRemoteVideoStats(const RtcConnection& connection, const RemoteVideoStats& stats) { + (void)connection; + (void)stats; + } + + /** + * Occurs when the SDK cannot reconnect to the server 10 seconds after its connection to the server is + * interrupted. + * + * The SDK triggers this callback when it cannot connect to the server 10 seconds after calling + * `joinChannel`, regardless of whether it is in the channel or not. If the SDK fails to rejoin + * the channel 20 minutes after being disconnected from Agora's edge server, the SDK stops rejoining the channel. + * + * @param connection The RtcConnection object. + */ + virtual void onConnectionLost(const RtcConnection& connection) { + (void)connection; + } + + /** Occurs when the connection between the SDK and the server is interrupted. + * @deprecated Use `onConnectionStateChanged` instead. + + The SDK triggers this callback when it loses connection with the server for more + than 4 seconds after the connection is established. After triggering this + callback, the SDK tries to reconnect to the server. If the reconnection fails + within a certain period (10 seconds by default), the onConnectionLost() + callback is triggered. If the SDK fails to rejoin the channel 20 minutes after + being disconnected from Agora's edge server, the SDK stops rejoining the channel. + + @param connection The RtcConnection object. + + */ + virtual void onConnectionInterrupted(const RtcConnection& connection) __deprecated { + (void)connection; + } + + /** Occurs when your connection is banned by the Agora Server. + * + * @param connection The RtcConnection object. + */ + virtual void onConnectionBanned(const RtcConnection& connection) { + (void)connection; + } + + /** Occurs when the local user receives the data stream from the remote user. 
+ * + * The SDK triggers this callback when the user receives the data stream that another user sends + * by calling the \ref agora::rtc::IRtcEngine::sendStreamMessage "sendStreamMessage" method. + * + * @param connection The RtcConnection object. + * @param remoteUid ID of the user who sends the data stream. + * @param streamId The ID of the stream data. + * @param data The data stream. + * @param length The length (byte) of the data stream. + * @param sentTs The time when the data stream was sent. + */ + virtual void onStreamMessage(const RtcConnection& connection, uid_t remoteUid, int streamId, const char* data, size_t length, uint64_t sentTs) { + (void)connection; + (void)remoteUid; + (void)streamId; + (void)data; + (void)length; + (void)sentTs; + } + + /** Occurs when the local user does not receive the data stream from the remote user. + * + * The SDK triggers this callback when the user fails to receive the data stream that another user sends + * by calling the \ref agora::rtc::IRtcEngine::sendStreamMessage "sendStreamMessage" method. + * + * @param connection The RtcConnection object. + * @param remoteUid ID of the user who sends the data stream. + * @param streamId The ID of the stream data. + * @param code The error code. + * @param missed The number of lost messages. + * @param cached The number of incoming cached messages when the data stream is + * interrupted. + */ + virtual void onStreamMessageError(const RtcConnection& connection, uid_t remoteUid, int streamId, int code, int missed, int cached) { + (void)connection; + (void)remoteUid; + (void)streamId; + (void)code; + (void)missed; + (void)cached; + } + + /** + * Occurs when the token expires. + * + * When the token expires during a call, the SDK triggers this callback to remind the app to renew the token. + * + * Upon receiving this callback, generate a new token at your app server and call + * `joinChannel` to pass the new token to the SDK. + * + * @param connection The RtcConnection object. 
+ */ + virtual void onRequestToken(const RtcConnection& connection) { + (void)connection; + } + + /** + * Occurs when connection license verification fails. + * + * You can determine the reason according to the error code. + */ + virtual void onLicenseValidationFailure(const RtcConnection& connection, LICENSE_ERROR_TYPE reason) { + (void)connection; + (void)reason; + } + + /** + * Occurs when the token will expire in 30 seconds. + * + * When the token is about to expire in 30 seconds, the SDK triggers this callback to remind the app to renew the token. + + * Upon receiving this callback, generate a new token at your app server and call + * \ref IRtcEngine::renewToken "renewToken" to pass the new Token to the SDK. + * + * @param connection The RtcConnection object. + * @param token The token that will expire in 30 seconds. + */ + virtual void onTokenPrivilegeWillExpire(const RtcConnection& connection, const char* token) { + (void)connection; + (void)token; + } + + /** Occurs when the first local audio frame is published. + * + * The SDK triggers this callback under one of the following circumstances: + * - The local client enables the audio module and calls `joinChannel` successfully. + * - The local client calls `muteLocalAudioStream(true)` and `muteLocalAudioStream(false)` in sequence. + * - The local client calls `disableAudio` and `enableAudio` in sequence. + * - The local client calls `pushAudioFrame` to successfully push the audio frame to the SDK. + * + * @param connection The RtcConnection object. + * @param elapsed The time elapsed (ms) from the local user calling `joinChannel` until the SDK triggers this callback. + */ + virtual void onFirstLocalAudioFramePublished(const RtcConnection& connection, int elapsed) { + (void)connection; + (void)elapsed; + } + + /** Occurs when the SDK receives the first audio frame from a specific remote user. + * @deprecated Use `onRemoteAudioStateChanged` instead. + * + * @param connection The RtcConnection object. 
+ * @param userId ID of the remote user. + * @param elapsed The time elapsed (ms) from the local user calling `joinChannel` + * until this callback is triggered. + */ + virtual void onFirstRemoteAudioFrame(const RtcConnection& connection, uid_t userId, int elapsed) __deprecated { + (void)connection; + (void)userId; + (void)elapsed; + } + + /** + * Occurs when the SDK decodes the first remote audio frame for playback. + * + * @deprecated Use `onRemoteAudioStateChanged` instead. + * The SDK triggers this callback under one of the following circumstances: + * - The remote user joins the channel and sends the audio stream for the first time. + * - The remote user's audio goes offline and then back online to re-send audio, meaning the local user has not + * received audio for 15 seconds. Reasons for such an interruption include: + * - The remote user leaves the channel. + * - The remote user drops offline. + * - The remote user calls muteLocalAudioStream to stop sending the audio stream. + * - The remote user calls disableAudio to disable audio. + * @param connection The RtcConnection object. + * @param uid User ID of the remote user sending the audio stream. + * @param elapsed The time elapsed (ms) from the local user calling `joinChannel` + * until this callback is triggered. + */ + virtual void onFirstRemoteAudioDecoded(const RtcConnection& connection, uid_t uid, int elapsed) __deprecated { + (void)connection; + (void)uid; + (void)elapsed; + } + + /** Occurs when the local audio state changes. + * + * When the state of the local audio stream changes (including the state of the audio capture and encoding), the SDK + * triggers this callback to report the current state. This callback indicates the state of the local audio stream, + * and allows you to troubleshoot issues when audio exceptions occur. + * + * @note + * When the state is `LOCAL_AUDIO_STREAM_STATE_FAILED(3)`, see the `error` + * parameter for details. + * + * @param connection The RtcConnection object. 
+ * @param state State of the local audio. See #LOCAL_AUDIO_STREAM_STATE. + * @param reason The reason information of the local audio. + * See #LOCAL_AUDIO_STREAM_REASON. + */ + virtual void onLocalAudioStateChanged(const RtcConnection& connection, LOCAL_AUDIO_STREAM_STATE state, LOCAL_AUDIO_STREAM_REASON reason) { + (void)connection; + (void)state; + (void)reason; + } + + /** Occurs when the remote audio state changes. + * + * When the audio state of a remote user (in the voice/video call channel) or host (in the live streaming channel) + * changes, the SDK triggers this callback to report the current state of the remote audio stream. + * + * @note This callback does not work properly when the number of users (in the voice/video call channel) or hosts + * (in the live streaming channel) in the channel exceeds 17. + * + * @param connection The RtcConnection object. + * @param remoteUid ID of the remote user whose audio state changes. + * @param state State of the remote audio. See #REMOTE_AUDIO_STATE. + * @param reason The reason of the remote audio state change. + * See #REMOTE_AUDIO_STATE_REASON. + * @param elapsed Time elapsed (ms) from the local user calling the + * `joinChannel` method until the SDK + * triggers this callback. + */ + virtual void onRemoteAudioStateChanged(const RtcConnection& connection, uid_t remoteUid, REMOTE_AUDIO_STATE state, REMOTE_AUDIO_STATE_REASON reason, int elapsed) { + (void)connection; + (void)remoteUid; + (void)state; + (void)reason; + (void)elapsed; + } + + /** + * Occurs when an active speaker is detected. + * + * After a successful call of `enableAudioVolumeIndication`, the SDK continuously detects which remote user has the + * loudest volume. During the current period, the remote user, who is detected as the loudest for the most times, + * is the most active user. 
+ * + * When the number of users is no less than two and an active remote speaker exists, the SDK triggers this callback and reports the uid of the most active remote speaker. + * - If the most active remote speaker is always the same user, the SDK triggers the `onActiveSpeaker` callback only once. + * - If the most active remote speaker changes to another user, the SDK triggers this callback again and reports the uid of the new active remote speaker. + * + * @param connection The RtcConnection object. + * @param uid The ID of the active speaker. A `uid` of 0 means the local user. + */ + virtual void onActiveSpeaker(const RtcConnection& connection, uid_t uid) { + (void)connection; + (void)uid; + } + + /** + * Occurs when the user role switches in the interactive live streaming. + * + * @param connection The RtcConnection of the local user: #RtcConnection + * @param oldRole The old role of the user: #CLIENT_ROLE_TYPE + * @param newRole The new role of the user: #CLIENT_ROLE_TYPE + * @param newRoleOptions The client role options of the new role: #ClientRoleOptions. + */ + virtual void onClientRoleChanged(const RtcConnection& connection, CLIENT_ROLE_TYPE oldRole, CLIENT_ROLE_TYPE newRole, const ClientRoleOptions& newRoleOptions) { + (void)connection; + (void)oldRole; + (void)newRole; + (void)newRoleOptions; + } + + /** + * Occurs when the user role in a Live-Broadcast channel fails to switch, for example, from a broadcaster + * to an audience or vice versa. + * + * @param connection The RtcConnection object. + * @param reason The reason for failing to change the client role: #CLIENT_ROLE_CHANGE_FAILED_REASON. + * @param currentRole The current role of the user: #CLIENT_ROLE_TYPE. + */ + virtual void onClientRoleChangeFailed(const RtcConnection& connection, CLIENT_ROLE_CHANGE_FAILED_REASON reason, CLIENT_ROLE_TYPE currentRole) { + (void)connection; + (void)reason; + (void)currentRole; + } + + /** Reports the transport-layer statistics of each remote audio stream. 
+ * @deprecated Use `onRemoteAudioStats` instead.
+
+ This callback reports the transport-layer statistics, such as the packet loss rate and network time delay, once every
+ two seconds after the local user receives an audio packet from a remote user/host during a call.
+
+ @param connection The RtcConnection object.
+ @param remoteUid ID of the remote user whose audio data packet is received.
+ @param delay The network time delay (ms) from the sender to the receiver.
+ @param lost The packet loss rate (%) of the audio packet sent from the remote
+ user.
+ @param rxKBitRate Received bitrate (Kbps) of the audio packet sent from the
+ remote user.
+ */
+ virtual void onRemoteAudioTransportStats(const RtcConnection& connection, uid_t remoteUid, unsigned short delay, unsigned short lost,
+ unsigned short rxKBitRate) __deprecated {
+ (void)connection;
+ (void)remoteUid;
+ (void)delay;
+ (void)lost;
+ (void)rxKBitRate;
+ }
+
+ /** Reports the transport-layer statistics of each remote video stream.
+ * @deprecated Use `onRemoteVideoStats` instead.
+
+ This callback reports the transport-layer statistics, such as the packet loss rate and network time
+ delay, once every two seconds after the local user receives a video packet from a remote user/host
+ during a call.
+
+ @param connection The RtcConnection object.
+ @param remoteUid ID of the remote user whose video packet is received.
+ @param delay The network time delay (ms) from the remote user sending the
+ video packet to the local user.
+ @param lost The packet loss rate (%) of the video packet sent from the remote
+ user.
+ @param rxKBitRate The bitrate (Kbps) of the video packet sent from
+ the remote user.
+ */
+ virtual void onRemoteVideoTransportStats(const RtcConnection& connection, uid_t remoteUid, unsigned short delay, unsigned short lost,
+ unsigned short rxKBitRate) __deprecated {
+ (void)connection;
+ (void)remoteUid;
+ (void)delay;
+ (void)lost;
+ (void)rxKBitRate;
+ }
+
+ /** Occurs when the network connection state changes.
+ *
+ * When the network connection state changes, the SDK triggers this callback and reports the current
+ * connection state and the reason for the change.
+ *
+ * @param connection The RtcConnection object.
+ * @param state The current connection state. See #CONNECTION_STATE_TYPE.
+ * @param reason The reason for a connection state change. See #CONNECTION_CHANGED_REASON_TYPE.
+ */
+ virtual void onConnectionStateChanged(const RtcConnection& connection,
+ CONNECTION_STATE_TYPE state,
+ CONNECTION_CHANGED_REASON_TYPE reason) {
+ (void)connection;
+ (void)state;
+ (void)reason;
+ }
+
+ /** Occurs when a Wi-Fi acceleration message needs to be sent to the user.
+ *
+ * @param connection The RtcConnection object.
+ * @param reason The reason for notifying the user.
+ * @param action The suggested action for the user.
+ * @param wlAccMsg The content of the message for the user.
+ */
+ virtual void onWlAccMessage(const RtcConnection& connection, WLACC_MESSAGE_REASON reason, WLACC_SUGGEST_ACTION action, const char* wlAccMsg) {
+ (void)connection;
+ (void)reason;
+ (void)action;
+ (void)wlAccMsg;
+ }
+
+ /** Reports the statistics of the Wi-Fi acceleration optimization effect.
+ *
+ * @param connection The RtcConnection object.
+ * @param currentStats The instantaneous value of the optimization effect.
+ * @param averageStats The average value of the cumulative optimization effect.
+ */
+ virtual void onWlAccStats(const RtcConnection& connection, WlAccStats currentStats, WlAccStats averageStats) {
+ (void)connection;
+ (void)currentStats;
+ (void)averageStats;
+ }
+
+ /** Occurs when the local network type changes.
+ *
+ * This callback occurs when the connection state of the local user changes. You can get the
+ * connection state and reason for the state change in this callback. When the network connection
+ * is interrupted, this callback indicates whether the interruption is caused by a network type
+ * change or poor network conditions.
+ * @param connection The RtcConnection object.
+ * @param type The type of the local network connection. See #NETWORK_TYPE.
+ */
+ virtual void onNetworkTypeChanged(const RtcConnection& connection, NETWORK_TYPE type) {
+ (void)connection;
+ (void)type;
+ }
+
+ /** Reports the built-in encryption errors.
+ *
+ * When encryption is enabled by calling `enableEncryption`, the SDK triggers this callback if an
+ * error occurs in encryption or decryption on the sender or the receiver side.
+ * @param connection The RtcConnection object.
+ * @param errorType The error type. See #ENCRYPTION_ERROR_TYPE.
+ */
+ virtual void onEncryptionError(const RtcConnection& connection, ENCRYPTION_ERROR_TYPE errorType) {
+ (void)connection;
+ (void)errorType;
+ }
+ /**
+ * Reports the user log upload result.
+ * @param connection The RtcConnection object.
+ * @param requestId The request ID of the upload.
+ * @param success Whether the upload succeeded.
+ * @param reason The reason for the upload result: 0: OK; 1: network error; 2: server error.
+ */
+ virtual void onUploadLogResult(const RtcConnection& connection, const char* requestId, bool success, UPLOAD_ERROR_REASON reason) {
+ (void)connection;
+ (void)requestId;
+ (void)success;
+ (void)reason;
+ }
+
+ /**
+ * Occurs when the user account is updated.
+ *
+ * @param connection The RtcConnection object.
+ * @param remoteUid The user ID.
+ * @param remoteUserAccount The user account.
+ */
+ virtual void onUserAccountUpdated(const RtcConnection& connection, uid_t remoteUid, const char* remoteUserAccount) {
+ (void)connection;
+ (void)remoteUid;
+ (void)remoteUserAccount;
+ }
+
+ /** Reports the result of taking a video snapshot.
+ *
+ * After a successful `takeSnapshot` method call, the SDK triggers this callback to report whether the snapshot is
+ * successfully taken, as well as the details for that snapshot.
+ * @param connection The RtcConnection object.
+ * @param uid The user ID. A `uid` of 0 indicates the local user.
+ * @param filePath The local path of the snapshot.
+ * @param width The width (px) of the snapshot.
+ * @param height The height (px) of the snapshot.
+ * @param errCode The message that confirms success or gives the reason why the snapshot is not successfully taken:
+ * - 0: Success.
+ * - < 0: Failure.
+ * - -1: The SDK fails to write data to a file or encode a JPEG image.
+ * - -2: The SDK does not find the video stream of the specified user within one second after the `takeSnapshot` method call succeeds.
+ * - -3: Calling the `takeSnapshot` method too frequently. Call the `takeSnapshot` method after receiving the `onSnapshotTaken`
+ * callback from the previous call.
+ */
+ virtual void onSnapshotTaken(const RtcConnection& connection, uid_t uid, const char* filePath, int width, int height, int errCode) {
+ (void)connection;
+ (void)uid;
+ (void)filePath;
+ (void)width;
+ (void)height;
+ (void)errCode;
+ }
+
+ /**
+ * Reports the tracing result of the video rendering event of the user.
+ *
+ * @param connection The RtcConnection object.
+ * @param uid The user ID.
+ * @param currentEvent The current event of the tracing result: #MEDIA_TRACE_EVENT.
+ * @param tracingInfo The tracing result: #VideoRenderingTracingInfo.
+ */
+ virtual void onVideoRenderingTracingResult(const RtcConnection& connection, uid_t uid, MEDIA_TRACE_EVENT currentEvent, VideoRenderingTracingInfo tracingInfo) {
+ (void)connection;
+ (void)uid;
+ (void)currentEvent;
+ (void)tracingInfo;
+ }
+
+ /**
+ * Reports the result of setting the RTM flag.
+ *
+ * @param connection The RtcConnection object.
+ * @param code The error code.
+ */
+ virtual void onSetRtmFlagResult(const RtcConnection& connection, int code) {
+ (void)connection;
+ (void)code;
+ }
+ /**
+ * Occurs when receiving a transcoded video stream that contains video layout info.
+ *
+ * @param connection The RtcConnection object.
+ * @param uid The user ID of the transcoded stream.
+ * @param width The width of the transcoded stream.
+ * @param height The height of the transcoded stream.
+ * @param layoutCount The count of layout info in the transcoded stream.
+ * @param layoutlist The video layout info list of the transcoded stream.
+ */
+ virtual void onTranscodedStreamLayoutInfo(const RtcConnection& connection, uid_t uid, int width, int height, int layoutCount, const VideoLayout* layoutlist) {
+ (void)connection;
+ (void)uid;
+ (void)width;
+ (void)height;
+ (void)layoutCount;
+ (void)layoutlist;
+ }
+};
+
+class IRtcEngineEx : public IRtcEngine {
+public:
+ /**
+ * Joins a channel with media options.
+ *
+ * This method enables users to join a channel. Users in the same channel can talk to each other,
+ * and multiple users in the same channel can start a group chat. Users with different App IDs
+ * cannot call each other.
+ *
+ * A successful call of this method triggers the following callbacks:
+ * - The local client: The `onJoinChannelSuccess` and `onConnectionStateChanged` callbacks.
+ * - The remote client: `onUserJoined`, if the user joining the channel is in the Communication
+ * profile or is a host in the Live-broadcasting profile.
+ *
+ * When the connection between the client and Agora's server is interrupted due to poor network
+ * conditions, the SDK tries reconnecting to the server. When the local client successfully rejoins
+ * the channel, the SDK triggers the `onRejoinChannelSuccess` callback on the local client.
+ *
+ * Compared to `joinChannel`, this method adds the options parameter to configure whether to
+ * automatically subscribe to all remote audio and video streams in the channel when the user
+ * joins the channel.
By default, the user subscribes to the audio and video streams of all
+ * the other users in the channel, which adds to usage and billing. To unsubscribe, set the
+ * `options` parameter or call the `mute` methods accordingly.
+ *
+ * @note
+ * - This method allows users to join only one channel at a time.
+ * - Ensure that the app ID you use to generate the token is the same app ID that you pass in the
+ * `initialize` method; otherwise, you may fail to join the channel by token.
+ *
+ * @param connection The RtcConnection object.
+ * @param token The token generated on your server for authentication.
+ * @param options The channel media options: ChannelMediaOptions.
+ * @param eventHandler The event handler: IRtcEngineEventHandler.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ * - -2: The parameter is invalid. For example, the token is invalid, the uid parameter is not set
+ * to an integer, or the value of a member in the `ChannelMediaOptions` structure is invalid. You need
+ * to pass in a valid parameter and join the channel again.
+ * - -3: Fails to initialize the `IRtcEngine` object. You need to reinitialize the IRtcEngine object.
+ * - -7: The IRtcEngine object has not been initialized. You need to initialize the IRtcEngine
+ * object before calling this method.
+ * - -8: The internal state of the IRtcEngine object is wrong. The typical cause is that you call
+ * this method to join the channel without calling `stopEchoTest` to stop the test after calling
+ * `startEchoTest` to start a call loop test. You need to call `stopEchoTest` before calling this method.
+ * - -17: The request to join the channel is rejected. The typical cause is that the user is already in the
+ * channel. Agora recommends using the `onConnectionStateChanged` callback to get whether the user is
+ * in the channel. Do not call this method to join the channel unless you receive the
+ * `CONNECTION_STATE_DISCONNECTED(1)` state.
+ * - -102: The channel name is invalid.
You need to pass in a valid channel name in channelId to
+ * rejoin the channel.
+ * - -121: The user ID is invalid. You need to pass in a valid user ID in uid to rejoin the channel.
+ */
+ virtual int joinChannelEx(const char* token, const RtcConnection& connection,
+ const ChannelMediaOptions& options,
+ IRtcEngineEventHandler* eventHandler) = 0;
+
+ /**
+ * Leaves the channel.
+ *
+ * This method allows a user to leave the channel, for example, by hanging up or exiting a call.
+ *
+ * This method is an asynchronous call, which means that it returns even before
+ * the user has actually left the channel. Once the user successfully leaves the channel, the
+ * SDK triggers the \ref IRtcEngineEventHandler::onLeaveChannel "onLeaveChannel" callback.
+ *
+ * @param connection The RtcConnection object.
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int leaveChannelEx(const RtcConnection& connection) = 0;
+
+ /**
+ * Leaves the channel with the connection ID.
+ *
+ * @param connection The RtcConnection object.
+ * @param options The options for leaving the channel. See #LeaveChannelOptions.
+ * @return int
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int leaveChannelEx(const RtcConnection& connection, const LeaveChannelOptions& options) = 0;
+
+ /**
+ * Updates the channel media options after joining the channel.
+ *
+ * @param options The channel media options: ChannelMediaOptions.
+ * @param connection The RtcConnection object.
+ * @return int
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int updateChannelMediaOptionsEx(const ChannelMediaOptions& options, const RtcConnection& connection) = 0;
+ /**
+ * Sets the video encoder configuration.
+ *
+ * Each configuration profile corresponds to a set of video parameters, including
+ * the resolution, frame rate, and bitrate.
+ *
+ * The parameters specified in this method are the maximum values under ideal network conditions.
+ * If the video engine cannot render the video using the specified parameters due + * to poor network conditions, the parameters further down the list are considered + * until a successful configuration is found. + * + * @param config The local video encoder configuration: VideoEncoderConfiguration. + * @param connection The RtcConnection object. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setVideoEncoderConfigurationEx(const VideoEncoderConfiguration& config, const RtcConnection& connection) = 0; + /** + * Initializes the video view of a remote user. + * + * This method initializes the video view of a remote stream on the local device. It affects only the + * video view that the local user sees. + * + * Usually the app should specify the `uid` of the remote video in the method call before the + * remote user joins the channel. If the remote `uid` is unknown to the app, set it later when the + * app receives the \ref IRtcEngineEventHandler::onUserJoined "onUserJoined" callback. + * + * To unbind the remote user from the view, set `view` in VideoCanvas as `null`. + * + * @note + * Ensure that you call this method in the UI thread. + * + * @param canvas The remote video view settings: VideoCanvas. + * @param connection The RtcConnection object. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setupRemoteVideoEx(const VideoCanvas& canvas, const RtcConnection& connection) = 0; + /** + * Stops or resumes receiving the audio stream of a specified user. + * + * @note + * You can call this method before or after joining a channel. If a user + * leaves a channel, the settings in this method become invalid. + * + * @param uid The ID of the specified user. + * @param mute Whether to stop receiving the audio stream of the specified user: + * - true: Stop receiving the audio stream of the specified user. + * - false: (Default) Resume receiving the audio stream of the specified user. + * @param connection The RtcConnection object. 
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int muteRemoteAudioStreamEx(uid_t uid, bool mute, const RtcConnection& connection) = 0;
+ /**
+ * Stops or resumes receiving the video stream of a specified user.
+ *
+ * @note
+ * You can call this method before or after joining a channel. If a user
+ * leaves a channel, the settings in this method become invalid.
+ *
+ * @param uid The ID of the specified user.
+ * @param mute Whether to stop receiving the video stream of the specified user:
+ * - true: Stop receiving the video stream of the specified user.
+ * - false: (Default) Resume receiving the video stream of the specified user.
+ * @param connection The RtcConnection object.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int muteRemoteVideoStreamEx(uid_t uid, bool mute, const RtcConnection& connection) = 0;
+ /**
+ * Sets the remote video stream type.
+ *
+ * If the remote user has enabled the dual-stream mode, the SDK receives the high-quality video stream by default.
+ * Call this method to switch to the low-quality video stream.
+ *
+ * @note
+ * This method applies to scenarios where the remote user has enabled the dual-stream mode using
+ * \ref enableDualStreamMode "enableDualStreamMode"(true) before joining the channel.
+ *
+ * @param uid ID of the remote user sending the video stream.
+ * @param streamType Sets the video stream type: #VIDEO_STREAM_TYPE.
+ * @param connection The RtcConnection object.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setRemoteVideoStreamTypeEx(uid_t uid, VIDEO_STREAM_TYPE streamType, const RtcConnection& connection) = 0;
+ /**
+ * Stops or resumes sending the local audio stream with the specified connection.
+ *
+ * @param mute Determines whether to send or stop sending the local audio stream:
+ * - true: Stop sending the local audio stream.
+ * - false: Send the local audio stream.
+ *
+ * @param connection The connection of the user ID.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int muteLocalAudioStreamEx(bool mute, const RtcConnection& connection) = 0;
+
+ /**
+ * Stops or resumes sending the local video stream with the specified connection.
+ *
+ * @param mute Determines whether to send or stop sending the local video stream:
+ * - true: Stop sending the local video stream.
+ * - false: Send the local video stream.
+ *
+ * @param connection The connection of the user ID.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int muteLocalVideoStreamEx(bool mute, const RtcConnection& connection) = 0;
+
+ /**
+ * Stops or resumes receiving all remote audio streams with the specified connection.
+ *
+ * @param mute Whether to stop receiving remote audio streams:
+ * - true: Stop receiving any remote audio stream.
+ * - false: Resume receiving all remote audio streams.
+ *
+ * @param connection The connection of the user ID.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int muteAllRemoteAudioStreamsEx(bool mute, const RtcConnection& connection) = 0;
+
+ /**
+ * Stops or resumes receiving all remote video streams with the specified connection.
+ *
+ * @param mute Whether to stop receiving remote video streams:
+ * - true: Stop receiving any remote video stream.
+ * - false: Resume receiving all remote video streams.
+ *
+ * @param connection The connection of the user ID.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int muteAllRemoteVideoStreamsEx(bool mute, const RtcConnection& connection) = 0;
+
+ /**
+ * Sets the blocklist for subscribing to remote audio streams.
+ *
+ * @note
+ * If a uid is in uidList, the remote user's audio is not subscribed to,
+ * even if muteRemoteAudioStream(uid, false) and muteAllRemoteAudioStreams(false) are called.
+ *
+ * @param uidList The ID list of users whose audio is not subscribed to.
+ * @param uidNumber The number of uids in uidList.
+ * @param connection The RtcConnection object.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
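+ *
+ * A minimal usage sketch (hypothetical values; `engine` and `connection` are assumed to be an
+ * initialized IRtcEngineEx instance and an active RtcConnection, not part of this header):
+ * @code
+ * // Stop subscribing to the audio of users 1001 and 1002 on this connection.
+ * uid_t blocked[] = {1001, 1002};
+ * engine->setSubscribeAudioBlocklistEx(blocked, 2, connection);
+ * @endcode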
+ */
+ virtual int setSubscribeAudioBlocklistEx(uid_t* uidList, int uidNumber, const RtcConnection& connection) = 0;
+
+ /**
+ * Sets the allowlist for subscribing to remote audio streams.
+ *
+ * @note
+ * - If a uid is in uidList, the remote user's audio is subscribed to,
+ * even if muteRemoteAudioStream(uid, true) and muteAllRemoteAudioStreams(true) are called.
+ * - If a user is in both the blocklist and the allowlist, the user's audio is not subscribed to.
+ *
+ * @param uidList The ID list of users whose audio is subscribed to.
+ * @param uidNumber The number of uids in uidList.
+ * @param connection The RtcConnection object.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setSubscribeAudioAllowlistEx(uid_t* uidList, int uidNumber, const RtcConnection& connection) = 0;
+
+ /**
+ * Sets the blocklist for subscribing to remote video streams.
+ *
+ * @note
+ * If a uid is in uidList, the remote user's video is not subscribed to,
+ * even if muteRemoteVideoStream(uid, false) and muteAllRemoteVideoStreams(false) are called.
+ *
+ * @param uidList The ID list of users whose video is not subscribed to.
+ * @param uidNumber The number of uids in uidList.
+ * @param connection The RtcConnection object.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setSubscribeVideoBlocklistEx(uid_t* uidList, int uidNumber, const RtcConnection& connection) = 0;
+
+ /**
+ * Sets the allowlist for subscribing to remote video streams.
+ *
+ * @note
+ * - If a uid is in uidList, the remote user's video is subscribed to,
+ * even if muteRemoteVideoStream(uid, true) and muteAllRemoteVideoStreams(true) are called.
+ * - If a user is in both the blocklist and the allowlist, the user's video is not subscribed to.
+ *
+ * @param uidList The ID list of users whose video is subscribed to.
+ * @param uidNumber The number of uids in uidList.
+ * @param connection The RtcConnection object.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setSubscribeVideoAllowlistEx(uid_t* uidList, int uidNumber, const RtcConnection& connection) = 0;
+ /**
+ * Sets the remote video subscription options.
+ *
+ * @param uid ID of the remote user sending the video stream.
+ * @param options Sets the video subscription options: VideoSubscriptionOptions.
+ * @param connection The RtcConnection object.
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setRemoteVideoSubscriptionOptionsEx(uid_t uid, const VideoSubscriptionOptions& options, const RtcConnection& connection) = 0;
+ /** Sets the sound position and gain of a remote user.
+
+ When the local user calls this method to set the sound position of a remote user, the sound difference between the left and right channels allows the local user to track the real-time position of the remote user, creating a real sense of space. This method applies to massively multiplayer online games, such as Battle Royale games.
+
+ @note
+ - For this method to work, enable stereo panning for remote users by calling the \ref agora::rtc::IRtcEngine::enableSoundPositionIndication "enableSoundPositionIndication" method before joining a channel.
+ - This method requires hardware support. For the best sound positioning, we recommend using a wired headset.
+ - Ensure that you call this method after joining a channel.
+
+ @param uid The ID of the remote user.
+ @param pan The sound position of the remote user. The value ranges from -1.0 to 1.0:
+ - 0.0: the remote sound comes from the front.
+ - -1.0: the remote sound comes from the left.
+ - 1.0: the remote sound comes from the right.
+ @param gain Gain of the remote user. The value ranges from 0.0 to 100.0. The default value is 100.0 (the original gain of the remote user). The smaller the value, the less the gain.
+ @param connection The RtcConnection object.
+
+ @return
+ - 0: Success.
+ - < 0: Failure.
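+
+ A minimal usage sketch (hypothetical values; `engine` and `connection` are assumed to be an
+ initialized IRtcEngineEx instance and an active RtcConnection, not part of this header):
+ @code
+ // Place remote user 1001 slightly to the left at the original gain.
+ engine->setRemoteVoicePositionEx(1001, -0.5, 100.0, connection);
+ @endcode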
+ */
+ virtual int setRemoteVoicePositionEx(uid_t uid, double pan, double gain, const RtcConnection& connection) = 0;
+ /** Sets remote user parameters for spatial audio.
+
+ @param uid The ID of the remote user.
+ @param params The spatial audio parameters. See SpatialAudioParams.
+ @param connection The RtcConnection object.
+
+ @return int
+ - 0: Success.
+ - < 0: Failure.
+ */
+ virtual int setRemoteUserSpatialAudioParamsEx(uid_t uid, const agora::SpatialAudioParams& params, const RtcConnection& connection) = 0;
+ /**
+ * Updates the display mode of the video view of a remote user.
+ *
+ * After initializing the video view of a remote user, you can call this method to update its
+ * rendering and mirror modes. This method affects only the video view that the local user sees.
+ *
+ * @note
+ * - Ensure that you have called \ref setupRemoteVideo "setupRemoteVideo" to initialize the remote video
+ * view before calling this method.
+ * - During a call, you can call this method as many times as necessary to update the display mode
+ * of the video view of a remote user.
+ *
+ * @param uid ID of the remote user.
+ * @param renderMode Sets the remote display mode. See #RENDER_MODE_TYPE.
+ * @param mirrorMode Sets the mirror type. See #VIDEO_MIRROR_MODE_TYPE.
+ * @param connection The RtcConnection object.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int setRemoteRenderModeEx(uid_t uid, media::base::RENDER_MODE_TYPE renderMode,
+ VIDEO_MIRROR_MODE_TYPE mirrorMode, const RtcConnection& connection) = 0;
+ /** Enables loopback recording.
+ *
+ * If you enable loopback recording, the output of the default sound card is mixed into
+ * the audio stream sent to the other end.
+ *
+ * @note This method is for macOS and Windows only.
+ *
+ * @param connection The RtcConnection object.
+ * @param enabled Sets whether to enable/disable loopback recording.
+ * - true: Enable loopback recording.
+ * - false: (Default) Disable loopback recording.
+ * @param deviceName Pointer to the device name of the sound card. The default value is NULL (the default sound card).
+ * - This method is for macOS and Windows only.
+ * - macOS does not support loopback capturing of the default sound card. If you need to use this method,
+ * please use a virtual sound card and pass its name to the deviceName parameter. Agora has tested and recommends using Soundflower.
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int enableLoopbackRecordingEx(const RtcConnection& connection, bool enabled, const char* deviceName = NULL) = 0;
+
+ /**
+ * Adjusts the recording volume.
+ *
+ * @param volume The recording volume, which ranges from 0 to 400:
+ * - 0 : Mute the recording volume.
+ * - 100: The original volume.
+ * - 400: (Maximum) Four times the original volume with signal clipping protection.
+ *
+ * @param connection The RtcConnection object.
+ *
+ * @return
+ * - 0 : Success.
+ * - < 0: Failure.
+ */
+ virtual int adjustRecordingSignalVolumeEx(int volume, const RtcConnection& connection) = 0;
+
+ /**
+ * Mutes or resumes the recording signal volume.
+ *
+ * @param mute Determines whether to mute or resume the recording signal volume.
+ * - true: Mute the recording signal volume.
+ * - false: (Default) Resume the recording signal volume.
+ *
+ * @param connection The RtcConnection object.
+ *
+ * @return
+ * - 0 : Success.
+ * - < 0: Failure.
+ */
+ virtual int muteRecordingSignalEx(bool mute, const RtcConnection& connection) = 0;
+
+ /**
+ * Adjusts the playback signal volume of a specified remote user.
+ * You can call this method as many times as necessary to adjust the playback volume of different remote users, or to repeatedly adjust the playback volume of the same remote user.
+ *
+ * @note
+ * The playback volume here refers to the mixed volume of a specified remote user.
+ * This method can only adjust the playback volume of one specified remote user at a time.
To adjust the playback volume of different remote users, call the method multiple times, once for each remote user.
+ *
+ * @param uid The ID of the remote user.
+ * @param volume The playback volume of the specified remote user. The value ranges between 0 and 400, including the following:
+ *
+ * - 0: Mute.
+ * - 100: (Default) Original volume.
+ * @param connection The RtcConnection object.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ */
+ virtual int adjustUserPlaybackSignalVolumeEx(uid_t uid, int volume, const RtcConnection& connection) = 0;
+
+ /** Gets the current connection state of the SDK.
+ @param connection The RtcConnection object.
+ @return #CONNECTION_STATE_TYPE.
+ */
+ virtual CONNECTION_STATE_TYPE getConnectionStateEx(const RtcConnection& connection) = 0;
+ /** Enables/Disables the built-in encryption.
+ *
+ * In scenarios requiring high security, Agora recommends calling this method to enable the built-in encryption before joining a channel.
+ *
+ * All users in the same channel must use the same encryption mode and encryption key. Once all users leave the channel, the encryption key of this channel is automatically cleared.
+ *
+ * @note
+ * - If you enable the built-in encryption, you cannot use the RTMP streaming function.
+ *
+ * @param connection The RtcConnection object.
+ * @param enabled Whether to enable the built-in encryption:
+ * - true: Enable the built-in encryption.
+ * - false: Disable the built-in encryption.
+ * @param config Configurations of built-in encryption schemas. See EncryptionConfig.
+ *
+ * @return
+ * - 0: Success.
+ * - < 0: Failure.
+ * - -2(ERR_INVALID_ARGUMENT): An invalid parameter is used. Set the parameter with a valid value.
+ * - -4(ERR_NOT_SUPPORTED): The encryption mode is incorrect or the SDK fails to load the external encryption library. Check the enumeration or reload the external encryption library.
+ * - -7(ERR_NOT_INITIALIZED): The SDK is not initialized.
Initialize the `IRtcEngine` instance before calling this method. + */ + virtual int enableEncryptionEx(const RtcConnection& connection, bool enabled, const EncryptionConfig& config) = 0; + /** Creates a data stream. + * + * You can call this method to create a data stream and improve the + * reliability and ordering of data transmission. + * + * @note + * - Ensure that you set the same value for `reliable` and `ordered`. + * - Each user can only create a maximum of 5 data streams during an RtcEngine + * lifecycle. + * - The data channel allows a data delay of up to 5 seconds. If the receiver + * does not receive the data stream within 5 seconds, the data channel reports + * an error. + * + * @param[out] streamId The ID of the stream data. + * @param reliable Sets whether the recipients are guaranteed to receive + * the data stream from the sender within five seconds: + * - true: The recipients receive the data stream from the sender within + * five seconds. If the recipient does not receive the data stream within + * five seconds, an error is reported to the application. + * - false: There is no guarantee that the recipients receive the data stream + * within five seconds and no error message is reported for any delay or + * missing data stream. + * @param ordered Sets whether the recipients receive the data stream + * in the sent order: + * - true: The recipients receive the data stream in the sent order. + * - false: The recipients do not receive the data stream in the sent order. + * @param connection The RtcConnection object. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int createDataStreamEx(int* streamId, bool reliable, bool ordered, const RtcConnection& connection) = 0; + /** Creates a data stream. + * + * Each user can create up to five data streams during the lifecycle of the IChannel. + * @param streamId The ID of the created data stream. + * @param config The config of the data stream. + * @param connection The RtcConnection object.
+ * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int createDataStreamEx(int* streamId, const DataStreamConfig& config, const RtcConnection& connection) = 0; + /** Sends a data stream. + * + * After calling \ref IRtcEngine::createDataStream "createDataStream", you can call + * this method to send a data stream to all users in the channel. + * + * The SDK has the following restrictions on this method: + * - Up to 60 packets can be sent per second in a channel with each packet having a maximum size of 1 KB. + * - Each client can send up to 30 KB of data per second. + * - Each user can have up to five data streams simultaneously. + * + * If the remote user receives the data stream within 5 seconds, the SDK triggers the + * \ref IRtcEngineEventHandler::onStreamMessage "onStreamMessage" callback on + * the remote client. If the remote user does not receive the data stream within 5 seconds, + * the SDK triggers the \ref IRtcEngineEventHandler::onStreamMessageError "onStreamMessageError" + * callback on the remote client. + * + * @note + * - Call this method after calling \ref IRtcEngine::createDataStream "createDataStream". + * - This method applies only to the `COMMUNICATION` profile or to + * the hosts in the `LIVE_BROADCASTING` profile. If an audience in the + * `LIVE_BROADCASTING` profile calls this method, the audience may be switched to a host. + * + * @param streamId The ID of the stream data. + * @param data The data stream. + * @param length The length (in bytes) of the data stream. + * @param connection The RtcConnection object. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int sendStreamMessageEx(int streamId, const char* data, size_t length, const RtcConnection& connection) = 0; + /** Adds a watermark image to the local video. + + This method adds a PNG watermark image to the local video in a live broadcast.
Once the watermark image is added, all the audience in the channel (CDN audience included), + and the recording device can see and capture it. Agora supports adding only one watermark image onto the local video, and a newly added watermark image replaces the previous one. + + The watermark position depends on the settings in the \ref IRtcEngine::setVideoEncoderConfiguration "setVideoEncoderConfiguration" method: + - If the orientation mode of the encoding video is #ORIENTATION_MODE_FIXED_LANDSCAPE, or the landscape mode in #ORIENTATION_MODE_ADAPTIVE, the watermark uses the landscape orientation. + - If the orientation mode of the encoding video is #ORIENTATION_MODE_FIXED_PORTRAIT, or the portrait mode in #ORIENTATION_MODE_ADAPTIVE, the watermark uses the portrait orientation. + - When setting the watermark position, the region must be less than the dimensions set in the `setVideoEncoderConfiguration` method. Otherwise, the watermark image will be cropped. + + @note + - Ensure that you have called the \ref agora::rtc::IRtcEngine::enableVideo "enableVideo" method to enable the video module before calling this method. + - If you only want to add a watermark image to the local video for the audience in the CDN live broadcast channel to see and capture, you can call this method or the \ref agora::rtc::IRtcEngine::setLiveTranscoding "setLiveTranscoding" method. + - This method supports adding a watermark image in the PNG file format only. Supported pixel formats of the PNG image are RGBA, RGB, Palette, Gray, and Alpha_gray. + - If the dimensions of the PNG image differ from your settings in this method, the image will be cropped or zoomed to conform to your settings. + - If you have enabled the local video preview by calling the \ref agora::rtc::IRtcEngine::startPreview "startPreview" method, you can use the `visibleInPreview` member in the WatermarkOptions class to set whether or not the watermark is visible in preview.
+ - If you have enabled the mirror mode for the local video, the watermark on the local video is also mirrored. To avoid mirroring the watermark, Agora recommends that you do not use the mirror and watermark functions for the local video at the same time. You can implement the watermark function in your application layer. + + @param watermarkUrl The local file path of the watermark image to be added. This method supports adding a watermark image from the local absolute or relative file path. + @param options Pointer to the watermark's options to be added. See WatermarkOptions for more information. + @param connection The RtcConnection object. + + @return + - 0: Success. + - < 0: Failure. + */ + virtual int addVideoWatermarkEx(const char* watermarkUrl, const WatermarkOptions& options, const RtcConnection& connection) = 0; + /** Removes the watermark image on the video stream added by + addVideoWatermark(). + + @param connection The RtcConnection object. + @return + - 0: Success. + - < 0: Failure. + */ + virtual int clearVideoWatermarkEx(const RtcConnection& connection) = 0; + /** Agora supports reporting and analyzing customized messages. + * + * This function is in the beta stage with a free trial. The ability provided + * in its beta test version is reporting a maximum of 10 message pieces within + * 6 seconds, with each message piece not exceeding 256 bytes. + * + * To try out this function, contact [support@agora.io](mailto:support@agora.io) + * and discuss the format of customized messages with us. + */ + virtual int sendCustomReportMessageEx(const char* id, const char* category, const char* event, const char* label, + int value, const RtcConnection& connection) = 0; + + /** + * Enables the `onAudioVolumeIndication` callback to report on which users are speaking + * and the speakers' volume.
+ * + * Once the \ref IRtcEngineEventHandler::onAudioVolumeIndication "onAudioVolumeIndication" + * callback is enabled, the SDK returns the volume indication at the time interval set + * in `enableAudioVolumeIndication`, regardless of whether any user is speaking in the channel. + * + * @param interval Sets the time interval between two consecutive volume indications: + * - <= 0: Disables the volume indication. + * - > 0: Time interval (ms) between two consecutive volume indications, + * which should be an integral multiple of 200 (values less than 200 are set to 200). + * @param smooth The smoothing factor that sets the sensitivity of the audio volume + * indicator. The value range is [0, 10]. The greater the value, the more sensitive the + * indicator. The recommended value is 3. + * @param reportVad + * - `true`: Enable the voice activity detection of the local user. Once it is enabled, the `vad` parameter of the + * `onAudioVolumeIndication` callback reports the voice activity status of the local user. + * - `false`: (Default) Disable the voice activity detection of the local user. Once it is disabled, the `vad` parameter + * of the `onAudioVolumeIndication` callback does not report the voice activity status of the local user, except for + * the scenario where the engine automatically detects the voice activity of the local user. + * @param connection The RtcConnection object. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int enableAudioVolumeIndicationEx(int interval, int smooth, bool reportVad, const RtcConnection& connection) = 0; + + /** Publishes the local stream without transcoding to a specified CDN live RTMP address. (CDN live only.) + * + * @param url The CDN streaming URL in the RTMP format. The maximum length of this parameter is 1024 bytes. + * @param connection The RtcConnection object. + * + * @return + * - 0: Success. + * - < 0: Failure.
+ */ + virtual int startRtmpStreamWithoutTranscodingEx(const char* url, const RtcConnection& connection) = 0; + + /** Publishes the local stream with transcoding to a specified CDN live RTMP address. (CDN live only.) + * + * @param url The CDN streaming URL in the RTMP format. The maximum length of this parameter is 1024 bytes. + * @param transcoding Sets the CDN live audio/video transcoding settings. See LiveTranscoding. + * @param connection The RtcConnection object. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int startRtmpStreamWithTranscodingEx(const char* url, const LiveTranscoding& transcoding, const RtcConnection& connection) = 0; + + /** Updates the video layout and audio settings for CDN live. (CDN live only.) + * @note This method applies to Live Broadcast only. + * + * @param transcoding Sets the CDN live audio/video transcoding settings. See LiveTranscoding. + * @param connection The RtcConnection object. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int updateRtmpTranscodingEx(const LiveTranscoding& transcoding, const RtcConnection& connection) = 0; + + /** Stops an RTMP stream, with or without transcoding, from the CDN. (CDN live only.) + * @param url The RTMP URL address to be removed. The maximum length of this parameter is 1024 bytes. + * @param connection The RtcConnection object. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int stopRtmpStreamEx(const char* url, const RtcConnection& connection) = 0; + + /** Starts relaying media streams across channels or updates the channels for media relay. + * + * @since v4.2.0 + * @param configuration The configuration of the media stream relay. See ChannelMediaRelayConfiguration. + * @param connection The RtcConnection object. + * @return + * - 0: Success. + * - < 0: Failure. + * - -1(ERR_FAILED): A general error occurs (no specified reason). + * - -2(ERR_INVALID_ARGUMENT): The argument is invalid. + * - -5(ERR_REFUSED): The request is rejected.
+ * - -8(ERR_INVALID_STATE): The current state is invalid; this method can only be called when the role is broadcaster. + */ + virtual int startOrUpdateChannelMediaRelayEx(const ChannelMediaRelayConfiguration& configuration, const RtcConnection& connection) = 0; + + /** Stops the media stream relay. + * + * Once the relay stops, the host quits all the destination + * channels. + * + * @param connection The RtcConnection object. + * @return + * - 0: Success. + * - < 0: Failure. + * - -1(ERR_FAILED): A general error occurs (no specified reason). + * - -2(ERR_INVALID_ARGUMENT): The argument is invalid. + * - -5(ERR_REFUSED): The request is rejected. + * - -7(ERR_NOT_INITIALIZED): No cross-channel media stream is being relayed. + */ + virtual int stopChannelMediaRelayEx(const RtcConnection& connection) = 0; + + /** Pauses the media stream relay across all channels. + * + * @param connection The RtcConnection object. + * @return + * - 0: Success. + * - < 0: Failure. + * - -1(ERR_FAILED): A general error occurs (no specified reason). + * - -2(ERR_INVALID_ARGUMENT): The argument is invalid. + * - -5(ERR_REFUSED): The request is rejected. + * - -7(ERR_NOT_INITIALIZED): No cross-channel media stream is being relayed. + */ + virtual int pauseAllChannelMediaRelayEx(const RtcConnection& connection) = 0; + + /** Resumes the media stream relay across all channels. + * + * @param connection The RtcConnection object. + * @return + * - 0: Success. + * - < 0: Failure. + * - -1(ERR_FAILED): A general error occurs (no specified reason). + * - -2(ERR_INVALID_ARGUMENT): The argument is invalid. + * - -5(ERR_REFUSED): The request is rejected. + * - -7(ERR_NOT_INITIALIZED): No cross-channel media stream is being relayed. + */ + virtual int resumeAllChannelMediaRelayEx(const RtcConnection& connection) = 0; + + /** Gets the user information by passing in the user account. + * It is the same as agora::rtc::IRtcEngine::getUserInfoByUserAccount. + * + * @param userAccount The user account of the user. Ensure that you set this parameter.
+ * @param [in,out] userInfo A userInfo object that identifies the user: + * - Input: A userInfo object. + * - Output: A userInfo object that contains the user account and user ID of the user. + * @param connection The RtcConnection object. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getUserInfoByUserAccountEx(const char* userAccount, rtc::UserInfo* userInfo, const RtcConnection& connection) = 0; + + /** Gets the user information by passing in the user ID. + * It is the same as agora::rtc::IRtcEngine::getUserInfoByUid. + * + * @param uid The user ID of the remote user. Ensure that you set this parameter. + * @param[in,out] userInfo A userInfo object that identifies the user: + * - Input: A userInfo object. + * - Output: A userInfo object that contains the user account and user ID of the user. + * @param connection The RtcConnection object. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getUserInfoByUidEx(uid_t uid, rtc::UserInfo* userInfo, const RtcConnection& connection) = 0; + + /** + * Enables or disables the dual video stream mode. + * + * @deprecated v4.2.0. This method is deprecated. Use setDualStreamModeEx instead. + * + * If dual-stream mode is enabled, the subscriber can choose to receive the high-stream + * (high-resolution high-bitrate video stream) or low-stream (low-resolution low-bitrate video + * stream) video using {@link setRemoteVideoStreamType setRemoteVideoStreamType}. + * + * @param enabled + * - true: Enable the dual-stream mode. + * - false: (default) Disable the dual-stream mode. + * @param streamConfig The configuration of the minor stream. + * @param connection The RtcConnection object. + */ + virtual int enableDualStreamModeEx(bool enabled, const SimulcastStreamConfig& streamConfig, + const RtcConnection& connection) = 0; + /** + * Enables, disables, or automatically enables the dual video stream mode.
+ * + * If dual-stream mode is enabled, the subscriber can choose to receive the high-stream + * (high-resolution high-bitrate video stream) or low-stream (low-resolution low-bitrate video + * stream) video using {@link setRemoteVideoStreamType setRemoteVideoStreamType}. + * + * @param mode The dual stream mode: #SIMULCAST_STREAM_MODE. + * @param streamConfig The configuration of the low stream: SimulcastStreamConfig. + * @param connection The RtcConnection object. + */ + virtual int setDualStreamModeEx(SIMULCAST_STREAM_MODE mode, + const SimulcastStreamConfig& streamConfig, + const RtcConnection& connection) = 0; + + /** + * Sets the high-priority user list and their fallback level under weak network conditions. + * + * @note + * - This method can be called before and after joining a channel. + * - If a subscriber is set to high priority, its stream falls back to a lower stream only after all normal-priority users have fallen back to their fallback level under weak network conditions, if needed. + * + * @param uidList The high priority user list. + * @param uidNum The size of uidList. + * @param option The fallback level of high priority users. + * @param connection An output parameter which is used to control different connection instances. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setHighPriorityUserListEx(uid_t* uidList, int uidNum, + STREAM_FALLBACK_OPTIONS option, + const RtcConnection& connection) = 0; + + /** + * Takes a snapshot of a video stream. + * + * This method takes a snapshot of a video stream from the specified user, generates a JPG + * image, and saves it to the specified path. + * + * The method is asynchronous, and the SDK has not taken the snapshot when the method call + * returns. After a successful method call, the SDK triggers the `onSnapshotTaken` callback + * to report whether the snapshot is successfully taken, as well as the details for that + * snapshot. + * + * @note + * - Call this method after joining a channel.
+ * - This method takes a snapshot of the published video stream specified in `ChannelMediaOptions`. + * - If the user's video has been preprocessed, for example, watermarked or beautified, the resulting + * snapshot includes the pre-processing effect. + * @param connection The RtcConnection object. + * @param uid The user ID. Set uid as 0 if you want to take a snapshot of the local user's video. + * @param filePath The local path (including filename extensions) of the snapshot. For example: + * - Windows: `C:\Users\\AppData\Local\Agora\\example.jpg` + * - iOS: `/App Sandbox/Library/Caches/example.jpg` + * - macOS: `~/Library/Logs/example.jpg` + * - Android: `/storage/emulated/0/Android/data//files/example.jpg` + * + * Ensure that the path you specify exists and is writable. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int takeSnapshotEx(const RtcConnection& connection, uid_t uid, const char* filePath) = 0; + + /** Enables video screenshot and upload with the connection ID. + @param enabled Whether to enable video screenshot and upload: + - `true`: Yes. + - `false`: No. + @param config The configuration for video screenshot and upload. + @param connection The connection information. See RtcConnection. + @return + - 0: Success. + - < 0: Failure. + */ + virtual int enableContentInspectEx(bool enabled, const media::ContentInspectConfig &config, const RtcConnection& connection) = 0; + + /** + @brief Starts tracing media rendering events. + @since v4.1.1 + @discussion + - The SDK traces media rendering events when this API is called. + - The tracing result can be obtained through the callback `IRtcEngineEventHandler(Ex)::onVideoRenderingTracingResult`. + @param connection The RtcConnection object. + @note + - By default, the SDK traces media rendering events when `IRtcEngineEx::joinChannelEx` is called. + - The start point of event tracing is reset after leaving the channel. + @return + - 0: Success. + - < 0: Failure.
+ * - -2(ERR_INVALID_ARGUMENT): The parameter is invalid. Check the channel ID and local uid set by parameter `connection`. + * - -7(ERR_NOT_INITIALIZED): The SDK is not initialized. Initialize the `IRtcEngine` instance before calling this method. + */ + virtual int startMediaRenderingTracingEx(const RtcConnection& connection) = 0; + + /** Provides the technical preview functionalities or special customizations by configuring the SDK with JSON options. + @since v4.3.0 + @param connection The connection information. See RtcConnection. + @param parameters Pointer to the set parameters in a JSON string. + @return + - 0: Success. + - < 0: Failure. + */ + virtual int setParametersEx(const RtcConnection& connection, const char* parameters) = 0; +}; + +} // namespace rtc +} // namespace agora diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraSpatialAudio.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraSpatialAudio.h new file mode 100644 index 000000000..f4a3ba6a3 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAgoraSpatialAudio.h @@ -0,0 +1,302 @@ +// +// AgoraRtcEngine SDK +// +// Copyright (c) 2019 Agora.io. All rights reserved. +// + +#pragma once + +#include +#include "AgoraBase.h" +#include "AgoraMediaBase.h" +#include "AgoraRefPtr.h" +#include "IAgoraRtcEngineEx.h" + +namespace agora { +namespace rtc { + +// The information of remote voice position +struct RemoteVoicePositionInfo { + // The coordinate of the remote voice source, (x, y, z) + float position[3]; + // The forward vector of the remote voice, (x, y, z). When it is not set, the vector points toward the listener.
+ float forward[3]; +}; + +struct SpatialAudioZone { + // the zone ID + int zoneSetId; + // zone center point + float position[3]; + // forward direction + float forward[3]; + // right direction + float right[3]; + // up direction + float up[3]; + // the forward side length of the zone + float forwardLength; + // the right side length of the zone + float rightLength; + // the up side length of the zone + float upLength; + // the audio attenuation of the zone + float audioAttenuation; +}; + +/** The definition of LocalSpatialAudioConfig + */ +struct LocalSpatialAudioConfig { + /* The reference to \ref IRtcEngine, which is the base interface class of the Agora RTC SDK and provides + * the real-time audio and video communication functionality. + */ + agora::rtc::IRtcEngine* rtcEngine; + + LocalSpatialAudioConfig() : rtcEngine(NULL) {} +}; + +/** The ILocalSpatialAudioEngine class provides the common methods of ICloudSpatialAudioEngine and ILocalSpatialAudioEngine. + */ +class ILocalSpatialAudioEngine: public RefCountInterface { + protected: + virtual ~ILocalSpatialAudioEngine() {} + + public: + /** + * Releases all the resources occupied by the spatial audio engine. + */ + virtual void release() = 0; + + /** + * Initializes the ILocalSpatialAudioEngine object and allocates the internal resources. + * + * @note Ensure that you call IRtcEngine::queryInterface and initialize before calling any other ILocalSpatialAudioEngine APIs. + * + * @param config The pointer to the LocalSpatialAudioConfig. See #LocalSpatialAudioConfig. + * + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int initialize(const LocalSpatialAudioConfig& config) = 0; + /** + * Updates the position information of a remote user. Call it when a remote user whose role is broadcaster moves. + * + * @param uid The remote user ID. It should be the same as the RTC channel remote user ID. + * @param posInfo The position information of the remote user. See #RemoteVoicePositionInfo. + * @return + * - 0: Success.
+ * - <0: Failure. + */ + virtual int updateRemotePosition(uid_t uid, const RemoteVoicePositionInfo &posInfo) = 0; + /** + * Updates the position of a remote user. It is intended to be used with IRtcEngineEx::joinChannelEx. + * + * @param uid The remote user ID. It should be the same as the RTC channel remote user ID. + * @param posInfo The position information of the remote user. See #RemoteVoicePositionInfo. + * @param connection The RTC connection whose spatial audio effect you want to update. See #RtcConnection. + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int updateRemotePositionEx(uid_t uid, const RemoteVoicePositionInfo &posInfo, const RtcConnection& connection) = 0; + /** + * Removes the position information of a remote user. Call it when the remote user calls IRtcEngine::leaveChannel. + * + * @param uid The remote user ID. It should be the same as the RTC channel remote user ID. + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int removeRemotePosition(uid_t uid) = 0; + /** + * Removes the position information of a remote user. It is intended to be used with IRtcEngineEx::joinChannelEx. + * + * @param uid The remote user ID. It should be the same as the RTC channel remote user ID. + * @param connection The RTC connection whose spatial audio effect you want to update. See #RtcConnection. + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int removeRemotePositionEx(uid_t uid, const RtcConnection& connection) = 0; + /** + * Clears the position information of remote users. It is intended to be used with IRtcEngineEx::joinChannelEx. + * + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int clearRemotePositionsEx(const RtcConnection& connection) = 0; + /** + * Updates the position of the local user. This method is used in scenarios with multiple RtcConnection instances. + * + * @note + * - This method is only effective in ILocalSpatialAudioEngine currently. + * + * @param position The sound position of the user.
The coordinate order is forward, right, and up. + * @param axisForward The vector in the direction of the forward axis in the coordinate system. + * @param axisRight The vector in the direction of the right axis in the coordinate system. + * @param axisUp The vector in the direction of the up axis in the coordinate system. + * @param connection The RTC connection whose spatial audio effect you want to update. + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int updateSelfPositionEx(const float position[3], const float axisForward[3], const float axisRight[3], const float axisUp[3], const RtcConnection& connection) = 0; + + /** + * This method sets the maximum number of streams that a player can receive in a + * specified audio reception range. + * + * @note You can call this method either before or after calling enterRoom: + * - Calling this method before enterRoom affects the maximum number of received streams + * the next time the player enters a room. + * - Calling this method after entering a room changes the current maximum number of + * received streams of the player. + * + * @param maxCount The maximum number of streams that a player can receive within + * a specified audio reception range. If the number of receivable streams exceeds + * the set value, the SDK receives the set number of streams closest to the player. + * + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int setMaxAudioRecvCount(int maxCount) = 0; + /** + * This method sets the audio reception range. The unit of the audio reception range + * is the same as the unit of distance in the game engine. + * + * @note You can call this method either before or after calling enterRoom. + * During the game, you can call it multiple times to update the audio reception range. + * + * @param range The maximum audio reception range, in the unit of game engine distance. + * + * @return + * - 0: Success. + * - <0: Failure. 
+ */ + virtual int setAudioRecvRange(float range) = 0; + + /** + * This method sets the distance unit of the game engine. The smaller the unit, the slower the sound fades + * with distance. + * + * @note You can call this method either before or after calling enterRoom. + * During the game, you can call it multiple times to update the distance unit. + * + * @param unit The number of meters that the game engine distance per unit is equal to. For example, setting unit as 2 means the game engine distance per unit equals 2 meters. + * + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int setDistanceUnit(float unit) = 0; + /** + * Updates the position of the local user. + * When calling it in ICloudSpatialAudioEngine, it triggers the SDK to update the user position to the Agora spatial audio server. The Agora spatial audio server uses the users' world coordinates and audio reception range to determine whether they are within each other's specified audio reception range. + * When calling it in ILocalSpatialAudioEngine, it triggers the SDK to calculate the relative position between the local and remote users and updates spatial audio parameters. + * + * When calling it in ICloudSpatialAudioEngine, note the following: + * @note + * - Call the method after calling enterRoom. + * - The call frequency is determined by the app. Agora recommends calling this method every + * 120 to 7000 ms. Otherwise, the SDK may lose synchronization with the server. + * + * @param position The sound position of the user. The coordinate order is forward, right, and up. + * @param axisForward The vector in the direction of the forward axis in the coordinate system. + * @param axisRight The vector in the direction of the right axis in the coordinate system. + * @param axisUp The vector in the direction of the up axis in the coordinate system. + * @return + * - 0: Success. + * - <0: Failure.
+ */ + virtual int updateSelfPosition(const float position[3], const float axisForward[3], const float axisRight[3], const float axisUp[3]) = 0; + /** + * Updates the position of a media player in the scene. This method has the same behavior in both ICloudSpatialAudioEngine and ILocalSpatialAudioEngine. + * + * @note + * - You only need to call this method once if the media player does not move in the virtual space. + * + * @param playerId The ID of the media player. You can get it by IMediaPlayer::getMediaPlayerId. + * @param positionInfo The position information of the media player in the virtual space. For detailed information, see the declaration of RemoteVoicePositionInfo. + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int updatePlayerPositionInfo(int playerId, const RemoteVoicePositionInfo& positionInfo) = 0; + + /** + * Sets parameters for the spatial audio engine. This method is deprecated. + * + * @param params The parameter string. + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int setParameters(const char* params) = 0; + + /** + * Mutes or unmutes the local audio stream. + * + * @param mute When it is false, the SDK sends the local audio stream; otherwise, it does not. + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int muteLocalAudioStream(bool mute) = 0; + /** + * Mutes all remote audio streams. It determines whether the SDK receives remote audio streams. + * + * @param mute When it is false, the SDK receives remote audio streams; otherwise, it does not. + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int muteAllRemoteAudioStreams(bool mute) = 0; + + /** + * Mutes or unmutes a remote user's audio stream. + * + * @param uid The ID of the remote user. + * @param mute When it is false, the SDK receives the remote user's audio stream; otherwise, it does not. + * @return + * - 0: Success. + * - <0: Failure.
+ */ + virtual int muteRemoteAudioStream(uid_t uid, bool mute) = 0; + + virtual int setRemoteAudioAttenuation(uid_t uid, double attenuation, bool forceSet) = 0; + + /** + * Sets up the sound spaces. + * + * @param zones The sound space array. + * @param zoneCount The number of sound spaces in the array. + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int setZones(const SpatialAudioZone *zones, unsigned int zoneCount) = 0; + /** + * Sets the audio attenuation coefficient of the media player. + * @param playerId The ID of the media player. You can get it by IMediaPlayer::getMediaPlayerId. + * @param attenuation The audio attenuation of the media player. + * @param forceSet Whether to force the setting of the audio attenuation coefficient. + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int setPlayerAttenuation(int playerId, double attenuation, bool forceSet) = 0; + /** + * Clears the position information of remote users. + * + * @return + * - 0: Success. + * - <0: Failure. + */ + virtual int clearRemotePositions() = 0; +}; + +} // namespace rtc +} // namespace agora diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAudioDeviceManager.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAudioDeviceManager.h new file mode 100644 index 000000000..77667b827 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/IAudioDeviceManager.h @@ -0,0 +1,482 @@ +// +// Agora SDK +// +// Copyright (c) 2021 Agora.io. All rights reserved. +// +#pragma once // NOLINT(build/header_guard) + +namespace agora { +namespace rtc { + +/** + * The maximum device ID length. + */ +enum MAX_DEVICE_ID_LENGTH_TYPE { + /** + * The maximum device ID length is 512. + */ + MAX_DEVICE_ID_LENGTH = 512 +}; + +/** + * The IAudioDeviceCollection class. + */ +class IAudioDeviceCollection { +public: + virtual ~IAudioDeviceCollection() {} + + /** + * Gets the total number of the playback or recording devices.
+ * + * Call \ref IAudioDeviceManager::enumeratePlaybackDevices + * "enumeratePlaybackDevices" first, and then call this method to return the + * number of the audio playback devices. + * + * @return + * - The number of the audio devices, if the method call succeeds. + * - < 0, if the method call fails. + */ + virtual int getCount() = 0; + + /** + * Gets the information of a specified audio device. + * @param index An input parameter that specifies the audio device. + * @param deviceName An output parameter that indicates the device name. + * @param deviceId An output parameter that indicates the device ID. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getDevice(int index, char deviceName[MAX_DEVICE_ID_LENGTH], + char deviceId[MAX_DEVICE_ID_LENGTH]) = 0; + + /** + * Specifies a device with the device ID. + * @param deviceId The device ID. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setDevice(const char deviceId[MAX_DEVICE_ID_LENGTH]) = 0; + + /** + * Gets the default audio device of the system (for macOS and Windows only). + * + * @param deviceName The name of the system default audio device. + * @param deviceId The device ID of the system default audio device. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getDefaultDevice(char deviceName[MAX_DEVICE_ID_LENGTH], char deviceId[MAX_DEVICE_ID_LENGTH]) = 0; + + /** + * Sets the volume of the app. + * + * @param volume The volume of the app. The value range is [0, 255]. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setApplicationVolume(int volume) = 0; + + /** + * Gets the volume of the app. + * + * @param volume The volume of the app. The value range is [0, 255]. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getApplicationVolume(int &volume) = 0; + + /** Mutes or unmutes the app. + * + * @param mute Determines whether to mute the app: + * - true: Mute the app. + * - false: Unmute the app.
+ * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setApplicationMute(bool mute) = 0; + + /** + * Gets the mute state of the app. + * + * @param mute A reference to the mute state of the app: + * - true: The app is muted. + * - false: The app is not muted. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int isApplicationMute(bool &mute) = 0; + + /** + * Releases all IAudioDeviceCollection resources. + */ + virtual void release() = 0; +}; + +/** + * The IAudioDeviceManager class. + */ +class IAudioDeviceManager : public RefCountInterface { +public: + virtual ~IAudioDeviceManager() {} + + /** + * Enumerates the audio playback devices. + * + * This method returns an IAudioDeviceCollection object that includes all the + * audio playback devices in the system. With the IAudioDeviceCollection + * object, the app can enumerate the audio playback devices. The app must call + * the \ref IAudioDeviceCollection::release "IAudioDeviceCollection::release" + * method to release the returned object after using it. + * + * @return + * - A pointer to the IAudioDeviceCollection object that includes all the + * audio playback devices in the system, if the method call succeeds. + * - The empty pointer NULL, if the method call fails. + */ + virtual IAudioDeviceCollection *enumeratePlaybackDevices() = 0; + + /** + * Enumerates the audio recording devices. + * + * This method returns an IAudioDeviceCollection object that includes all the + * audio recording devices in the system. With the IAudioDeviceCollection + * object, the app can enumerate the audio recording devices. The app needs to + * call the \ref IAudioDeviceCollection::release + * "IAudioDeviceCollection::release" method to release the returned object + * after using it. + * + * @return + * - A pointer to the IAudioDeviceCollection object that includes all the + * audio recording devices in the system, if the method call succeeds. 
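The collection contract above (getCount, then getDevice into caller-owned buffers, then release) can be sketched with a minimal stand-in; `DummyCollection` and its in-memory device list below are our own illustration of the calling pattern, not the SDK's implementation:

```cpp
#include <cstring>
#include <string>
#include <utility>
#include <vector>

// Illustrative stand-in for IAudioDeviceCollection (NOT the real SDK class):
// mimics the getCount / getDevice / release contract described above.
enum { MAX_DEVICE_ID_LENGTH = 512 };

struct DummyCollection {
    // (name, id) pairs standing in for the system's audio devices.
    std::vector<std::pair<std::string, std::string>> devices;

    int getCount() { return static_cast<int>(devices.size()); }

    // Copies the name/id of the device at `index` into the caller's buffers.
    // Returns 0 on success, -1 on an out-of-range index (mirroring "< 0: Failure").
    int getDevice(int index, char deviceName[MAX_DEVICE_ID_LENGTH],
                  char deviceId[MAX_DEVICE_ID_LENGTH]) {
        if (index < 0 || index >= getCount()) return -1;
        std::strncpy(deviceName, devices[index].first.c_str(), MAX_DEVICE_ID_LENGTH - 1);
        deviceName[MAX_DEVICE_ID_LENGTH - 1] = '\0';
        std::strncpy(deviceId, devices[index].second.c_str(), MAX_DEVICE_ID_LENGTH - 1);
        deviceId[MAX_DEVICE_ID_LENGTH - 1] = '\0';
        return 0;
    }

    // The real SDK requires release() after use; here it just clears the list.
    void release() { devices.clear(); }
};
```

The fixed-size `char[MAX_DEVICE_ID_LENGTH]` output buffers match the header's convention: the caller allocates, the collection copies.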
+ * - The empty pointer NULL, if the method call fails. + */ + virtual IAudioDeviceCollection *enumerateRecordingDevices() = 0; + + /** + * Specifies an audio playback device with the device ID. + * + * @param deviceId ID of the audio playback device. It can be retrieved by the + * \ref enumeratePlaybackDevices "enumeratePlaybackDevices" method. Plugging + * or unplugging the audio device does not change the device ID. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setPlaybackDevice(const char deviceId[MAX_DEVICE_ID_LENGTH]) = 0; + + /** + * Gets the ID of the audio playback device. + * @param deviceId An output parameter that specifies the ID of the audio + * playback device. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getPlaybackDevice(char deviceId[MAX_DEVICE_ID_LENGTH]) = 0; + + /** + * Gets the device ID and device name of the audio playback device. + * @param deviceId An output parameter that specifies the ID of the audio + * playback device. + * @param deviceName An output parameter that specifies the name of the audio + * playback device. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getPlaybackDeviceInfo(char deviceId[MAX_DEVICE_ID_LENGTH], + char deviceName[MAX_DEVICE_ID_LENGTH]) = 0; + + /** + * Sets the volume of the audio playback device. + * @param volume The volume of the audio playing device. The value range is + * [0, 255]. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setPlaybackDeviceVolume(int volume) = 0; + + /** + * Gets the volume of the audio playback device. + * @param volume The volume of the audio playback device. The value range is + * [0, 255]. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getPlaybackDeviceVolume(int *volume) = 0; + + /** + * Specifies an audio recording device with the device ID. + * + * @param deviceId ID of the audio recording device. 
It can be retrieved by + * the \ref enumerateRecordingDevices "enumerateRecordingDevices" method. + * Plugging or unplugging the audio device does not change the device ID. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setRecordingDevice(const char deviceId[MAX_DEVICE_ID_LENGTH]) = 0; + + /** + * Gets the audio recording device by the device ID. + * + * @param deviceId ID of the audio recording device. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getRecordingDevice(char deviceId[MAX_DEVICE_ID_LENGTH]) = 0; + + /** + * Gets the information of the audio recording device by the device ID and + * device name. + * + * @param deviceId ID of the audio recording device. + * @param deviceName The name of the audio recording device. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getRecordingDeviceInfo(char deviceId[MAX_DEVICE_ID_LENGTH], + char deviceName[MAX_DEVICE_ID_LENGTH]) = 0; + + /** + * Sets the volume of the recording device. + * @param volume The volume of the recording device. The value range is [0, + * 255]. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setRecordingDeviceVolume(int volume) = 0; + + /** + * Gets the volume of the recording device. + * @param volume The volume of the microphone, ranging from 0 to 255. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getRecordingDeviceVolume(int *volume) = 0; + + /** + * Specifies an audio loopback recording device with the device ID. + * + * @param deviceId ID of the audio loopback recording device. It can be retrieved by + * the \ref enumeratePlaybackDevices "enumeratePlaybackDevices" method. + * Plugging or unplugging the audio device does not change the device ID. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setLoopbackDevice(const char deviceId[MAX_DEVICE_ID_LENGTH]) = 0; + + /** + * Gets the audio loopback recording device by the device ID. 
+ * + * @param deviceId ID of the audio loopback recording device. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getLoopbackDevice(char deviceId[MAX_DEVICE_ID_LENGTH]) = 0; + + /** + * Mutes or unmutes the audio playback device. + * + * @param mute Determines whether to mute the audio playback device. + * - true: Mute the device. + * - false: Unmute the device. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setPlaybackDeviceMute(bool mute) = 0; + + /** + * Gets the mute state of the playback device. + * + * @param mute A pointer to the mute state of the playback device. + * - true: The playback device is muted. + * - false: The playback device is unmuted. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getPlaybackDeviceMute(bool *mute) = 0; + + /** + * Mutes or unmutes the audio recording device. + * + * @param mute Determines whether to mute the recording device. + * - true: Mute the microphone. + * - false: Unmute the microphone. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int setRecordingDeviceMute(bool mute) = 0; + + /** + * Gets the mute state of the audio recording device. + * + * @param mute A pointer to the mute state of the recording device. + * - true: The microphone is muted. + * - false: The microphone is unmuted. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int getRecordingDeviceMute(bool *mute) = 0; + + /** + * Starts the audio playback device test. + * + * This method tests if the playback device works properly. In the test, the + * SDK plays an audio file specified by the user. If the user hears the audio, + * the playback device works properly. + * + * @param testAudioFilePath The file path of the audio file for the test, + * which is an absolute path in UTF8: + * - Supported file format: wav, mp3, m4a, and aac. + * - Supported file sampling rate: 8000, 16000, 32000, 44100, and 48000. 
+ * + * @return + * - 0, if the method call succeeds and you can hear the sound of the + * specified audio file. + * - An error code, if the method call fails. + */ + virtual int startPlaybackDeviceTest(const char *testAudioFilePath) = 0; + + /** + * Stops the audio playback device test. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int stopPlaybackDeviceTest() = 0; + + /** + * Starts the recording device test. + * + * This method tests whether the recording device works properly. Once the + * test starts, the SDK uses the \ref + * IRtcEngineEventHandler::onAudioVolumeIndication "onAudioVolumeIndication" + * callback to notify the app of the volume information. + * + * @param indicationInterval The time interval (ms) at which the SDK + * triggers the `onAudioVolumeIndication` callback. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int startRecordingDeviceTest(int indicationInterval) = 0; + + /** + * Stops the recording device test. + * + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int stopRecordingDeviceTest() = 0; + + /** + * Starts the audio device loopback test. + * + * This method tests whether the local audio devices are working properly. + * After calling this method, the microphone captures the local audio and + * plays it through the speaker, and the \ref + * IRtcEngineEventHandler::onAudioVolumeIndication "onAudioVolumeIndication" + * callback returns the local audio volume information at the set interval. + * + * @note This method tests the local audio devices and does not report the + * network conditions. + * @param indicationInterval The time interval (ms) at which the \ref + * IRtcEngineEventHandler::onAudioVolumeIndication "onAudioVolumeIndication" + * callback returns. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int startAudioDeviceLoopbackTest(int indicationInterval) = 0; + + /** + * Stops the audio device loopback test.
+ * + * @note Ensure that you call this method to stop the loopback test after + * calling the \ref IAudioDeviceManager::startAudioDeviceLoopbackTest + * "startAudioDeviceLoopbackTest" method. + * @return + * - 0: Success. + * - < 0: Failure. + */ + virtual int stopAudioDeviceLoopbackTest() = 0; + + /** Sets whether the current playback device follows the system default playback device. + + @param enable Whether to follow the system default playback device: + - true: The current device changes when the system default playback device changes. + - false: The current device changes only when the current device is removed. + @return + - 0: Success. + - < 0: Failure. + */ + virtual int followSystemPlaybackDevice(bool enable) = 0; + + /** Sets whether the current recording device follows the system default recording device. + + @param enable Whether to follow the system default recording device: + - true: The current device changes when the system default recording device changes. + - false: The current device changes only when the current device is removed. + @return + - 0: Success. + - < 0: Failure. + */ + virtual int followSystemRecordingDevice(bool enable) = 0; + + /** Sets whether the current loopback device follows the system default loopback device. + + @param enable Whether to follow the system default loopback device: + - true: The current device changes when the system default loopback device changes. + - false: The current device changes only when the current device is removed. + @return + - 0: Success. + - < 0: Failure. + */ + virtual int followSystemLoopbackDevice(bool enable) = 0; + + /** + * Releases all IAudioDeviceManager resources.
+ */ + virtual void release() = 0; +}; + +} // namespace rtc +} // namespace agora diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/time_utils.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/time_utils.h new file mode 100644 index 000000000..bb4759b71 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/agora/time_utils.h @@ -0,0 +1,85 @@ +// +// Agora Media SDK +// +// Copyright (c) 2021 Agora IO. All rights reserved. +// + +#pragma once +#include <stdint.h> + +namespace agora { +namespace base { + +class NtpTime { + public: + static const uint64_t ntpFracPerSecond = 4294967296; + + NtpTime() : ms_(0) {} + + NtpTime(uint64_t ms) : ms_(ms) {} + + NtpTime(uint32_t seconds, uint32_t fractions) { + const double fracMs = fractions * 1000.0 / static_cast<double>(ntpFracPerSecond); + ms_ = static_cast<uint64_t>(seconds) * 1000 + static_cast<uint64_t>(0.5 + fracMs); + } + + operator uint64_t() const { return ms_; } + + /** Gets the NTP time. + * + * @return + * - The wall clock time in milliseconds relative to 0h UTC on 1 January 1900. + */ + uint64_t Ms() const { + return ms_; + } + + /** Checks whether the NtpTime object is valid. + * + * @return + * - `true`: the NtpTime object is valid. + * - `false`: the NtpTime object is invalid. + */ + bool Valid() const { return ms_ != 0; } + + /** Gets the integer part of the NTP timestamp. + * + * @return + * - A uint32_t value. + */ + uint32_t ToSeconds() const { + return static_cast<uint32_t>(ms_ / 1000); + } + + /** Gets the fractional part of the NTP timestamp. + * + * @return + * - A uint32_t value. + */ + uint32_t ToFractions() const { + return static_cast<uint32_t>((ms_ % 1000) * static_cast<double>(ntpFracPerSecond) / 1000.0); + } + + /** Gets the NTP timestamp. + * + * @note + * - The full resolution NTP timestamp is a 64-bit unsigned fixed-point number with the integer part in the first 32 bits and the fractional part in the last 32 bits. + * + * @return + * - A uint64_t value.
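The 32.32 fixed-point arithmetic above can be sanity-checked with a standalone sketch; the free functions below are assumed equivalent to the `NtpTime(seconds, fractions)` constructor and the `ToSeconds`/`ToFractions` accessors (the function names are ours, not the SDK's):

```cpp
#include <cstdint>

// 2^32 NTP fractions make up one second.
static const uint64_t kNtpFracPerSecond = 4294967296ULL;

// Mirror of the NtpTime(seconds, fractions) constructor: convert the
// fractional part to milliseconds, rounding to the nearest millisecond.
uint64_t NtpToMs(uint32_t seconds, uint32_t fractions) {
    const double fracMs = fractions * 1000.0 / static_cast<double>(kNtpFracPerSecond);
    return static_cast<uint64_t>(seconds) * 1000 + static_cast<uint64_t>(0.5 + fracMs);
}

// Mirrors of ToSeconds()/ToFractions(): split milliseconds back into the
// 32-bit integer and fractional halves of the NTP timestamp.
uint32_t MsToSeconds(uint64_t ms) { return static_cast<uint32_t>(ms / 1000); }

uint32_t MsToFractions(uint64_t ms) {
    return static_cast<uint32_t>((ms % 1000) * static_cast<double>(kNtpFracPerSecond) / 1000.0);
}
```

For example, half a second is 2^31 fractions, so `NtpToMs(10, 2147483648u)` yields 10500 ms, and splitting 10500 ms back out recovers seconds 10 and fractions 2147483648. Note the round trip is only exact up to millisecond resolution, since `ms_` is the internal unit.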
+ */ + uint64_t ToTimestamp() const { + return ToSeconds() * ntpFracPerSecond + ToFractions(); + } + + private: + uint64_t ms_; +}; + +inline bool operator==(const NtpTime& n1, const NtpTime& n2) { + return static_cast<uint64_t>(n1) == static_cast<uint64_t>(n2); +} + +inline bool operator!=(const NtpTime& n1, const NtpTime& n2) { return !(n1 == n2); } + +} // namespace base +} // namespace agora diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/packet_processing_plugin_jni.h b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/packet_processing_plugin_jni.h new file mode 100644 index 000000000..17235f849 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/include/packet_processing_plugin_jni.h @@ -0,0 +1,12 @@ +#ifndef ADVANCED_VIDEO_ANDROID_PACKET_PROCESSING_PLUGIN_JNI_H +#define ADVANCED_VIDEO_ANDROID_PACKET_PROCESSING_PLUGIN_JNI_H + +#include <jni.h> + +extern "C" JNIEXPORT void JNICALL +Java_io_agora_api_streamencrypt_PacketProcessor_doRegisterProcessing(JNIEnv *env, jclass clazz, jlong rtcEngineHandler); + +extern "C" JNIEXPORT void JNICALL +Java_io_agora_api_streamencrypt_PacketProcessor_doUnregisterProcessing(JNIEnv *env, jclass clazz, jlong rtcEngineHandler); + +#endif //ADVANCED_VIDEO_ANDROID_PACKET_PROCESSING_PLUGIN_JNI_H \ No newline at end of file diff --git a/Android/APIExample/agora-stream-encrypt/src/main/cpp/packet_processing_plugin_jni.cpp b/Android/APIExample/agora-stream-encrypt/src/main/cpp/packet_processing_plugin_jni.cpp new file mode 100644 index 000000000..fdc3f2e23 --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/cpp/packet_processing_plugin_jni.cpp @@ -0,0 +1,174 @@ +#include +#include +#include +#include +#include +#include + +#include +#include + +#include "./include/agora/IAgoraMediaEngine.h" +#include "./include/agora/IAgoraRtcEngine.h" + +#include "./include/packet_processing_plugin_jni.h" + +/** Stream data frame listener. */ +class AgoraRTCPacketObserver : public
agora::rtc::IPacketObserver { + public: + AgoraRTCPacketObserver() { + __android_log_print(ANDROID_LOG_INFO, "agoraencryption", "AgoraRTCPacketObserver0"); + m_txAudioBuffer.resize(2048); + m_rxAudioBuffer.resize(2048); + m_txVideoBuffer.resize(2048); + m_rxVideoBuffer.resize(2048); + __android_log_print(ANDROID_LOG_INFO, "agoraencryption", "AgoraRTCPacketObserver1"); + } + + /**Occurs when the local user sends an audio packet. + * @param packet The sent audio packet. + * @return + * true: The audio packet is sent successfully. + * false: The audio packet is discarded.*/ + virtual bool onSendAudioPacket(Packet &packet) { + __android_log_print(ANDROID_LOG_INFO, "agoraencryption", "onSendAudioPacket0"); + int i; + //encrypt the packet + const unsigned char *p = packet.buffer; + const unsigned char *pe = packet.buffer + packet.size; + + for (i = 0; p < pe && i < m_txAudioBuffer.size(); ++p, ++i) { + m_txAudioBuffer[i] = *p ^ 0x55; + } + //assign new buffer and the length back to SDK + packet.buffer = &m_txAudioBuffer[0]; + packet.size = i; + return true; + } + + /**Occurs when the local user sends a video packet. + * @param packet The sent video packet. + * @return + * true: The video packet is sent successfully. + * false: The video packet is discarded.*/ + virtual bool onSendVideoPacket(Packet &packet) { + __android_log_print(ANDROID_LOG_INFO, "agoraencryption", "onSendVideoPacket0"); + int i; + //encrypt the packet + const unsigned char *p = packet.buffer; + const unsigned char *pe = packet.buffer + packet.size; + for (i = 0; p < pe && i < m_txVideoBuffer.size(); ++p, ++i) { + m_txVideoBuffer[i] = *p ^ 0x55; + } + //assign new buffer and the length back to SDK + packet.buffer = &m_txVideoBuffer[0]; + packet.size = i; + return true; + } + + /**Occurs when the local user receives an audio packet. + * @param packet The received audio packet. + * @return + * true: The audio packet is received successfully.
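The transform used by all four callbacks is a plain byte-wise XOR with 0x55; since XOR is its own inverse, one routine serves both the onSend* (encrypt) and onReceive* (decrypt) paths. A minimal standalone sketch of just the transform (the helper name is ours, not SDK API):

```cpp
#include <cstddef>
#include <vector>

// Byte-wise XOR with 0x55, as in the packet observer above.
// Applying it twice returns the original bytes (XOR is an involution).
std::vector<unsigned char> XorTransform(const unsigned char *data, size_t size) {
    std::vector<unsigned char> out(size);
    for (size_t i = 0; i < size; ++i) {
        out[i] = data[i] ^ 0x55;
    }
    return out;
}
```

In the observer itself, the output must live in a member buffer (not a local) because `packet.buffer` is handed back to the SDK and has to outlive the callback.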
+ * false: The audio packet is discarded.*/ + virtual bool onReceiveAudioPacket(Packet &packet) { + __android_log_print(ANDROID_LOG_INFO, "agoraencryption", "onReceiveAudioPacket0"); + int i = 0; + //decrypt the packet + const unsigned char *p = packet.buffer; + const unsigned char *pe = packet.buffer + packet.size; + for (i = 0; p < pe && i < m_rxAudioBuffer.size(); ++p, ++i) { + m_rxAudioBuffer[i] = *p ^ 0x55; + } + //assign new buffer and the length back to SDK + packet.buffer = &m_rxAudioBuffer[0]; + packet.size = i; + return true; + } + + /**Occurs when the local user receives a video packet. + * @param packet The received video packet. + * @return + * true: The video packet is received successfully. + * false: The video packet is discarded.*/ + virtual bool onReceiveVideoPacket(Packet &packet) { + __android_log_print(ANDROID_LOG_INFO, "agoraencryption", "onReceiveVideoPacket0"); + int i = 0; + //decrypt the packet + const unsigned char *p = packet.buffer; + const unsigned char *pe = packet.buffer + packet.size; + + for (i = 0; p < pe && i < m_rxVideoBuffer.size(); ++p, ++i) { + m_rxVideoBuffer[i] = *p ^ 0x55; + } + //assign new buffer and the length back to SDK + packet.buffer = &m_rxVideoBuffer[0]; + packet.size = i; + return true; + } + + private: + std::vector<unsigned char> m_txAudioBuffer; //buffer for sending audio data + std::vector<unsigned char> m_txVideoBuffer; //buffer for sending video data + + std::vector<unsigned char> m_rxAudioBuffer; //buffer for receiving audio data + std::vector<unsigned char> m_rxVideoBuffer; //buffer for receiving video data +}; + +static AgoraRTCPacketObserver s_packetObserver; + +#ifdef __cplusplus +extern "C" { +#endif + +JNIEXPORT jint JNI_OnLoad(JavaVM *vm, void *reserved) { + JNIEnv *env = NULL; + jint result = -1; + + if (vm->GetEnv((void **) &env, JNI_VERSION_1_4) != JNI_OK) { + return result; + } + + assert(env != NULL); + result = JNI_VERSION_1_6; + return result; +} + +JNIEXPORT void JNICALL +Java_io_agora_api_streamencrypt_PacketProcessor_doRegisterProcessing + (JNIEnv
*env, jclass clazz, jlong rtcEngineHandler) { + + agora::rtc::IRtcEngine *rtcEngine = reinterpret_cast<agora::rtc::IRtcEngine *>(rtcEngineHandler); + __android_log_print(ANDROID_LOG_INFO, "agoraencryption", "doRegisterProcessing0"); + if (!rtcEngine) return; + __android_log_print(ANDROID_LOG_INFO, "agoraencryption", "doRegisterProcessing1"); +/**Registers a packet observer. + * The Agora SDK allows your application to register a packet observer to receive callbacks for + * voice or video packet transmission. + * @param observer Pointer to the registered packet observer. + * @return + * 0: Success. + * < 0: Failure. + * Note: + * The size of the packet sent to the network after processing should not exceed 1200 bytes; + * otherwise, the packet may fail to be sent. + * Ensure that both receivers and senders call this method; otherwise, you may encounter undefined + * behavior such as no audio or a black screen. + * When you use the CDN live streaming, recording, or storage functions, Agora doesn't recommend + * calling this method.*/ + int code = rtcEngine->registerPacketObserver(&s_packetObserver); + __android_log_print(ANDROID_LOG_INFO, "agoraencryption", "%d", code); +} + +JNIEXPORT void JNICALL +Java_io_agora_api_streamencrypt_PacketProcessor_doUnregisterProcessing + (JNIEnv *env, jclass clazz, jlong rtcEngineHandler) { + agora::rtc::IRtcEngine *rtcEngine = reinterpret_cast<agora::rtc::IRtcEngine *>(rtcEngineHandler); + if (!rtcEngine) return; + __android_log_print(ANDROID_LOG_INFO, "agoraencryption", "doUnregisterProcessing"); + rtcEngine->registerPacketObserver(nullptr); +} + +#ifdef __cplusplus +} +#endif \ No newline at end of file diff --git a/Android/APIExample/agora-stream-encrypt/src/main/java/io/agora/api/streamencrypt/PacketProcessor.java b/Android/APIExample/agora-stream-encrypt/src/main/java/io/agora/api/streamencrypt/PacketProcessor.java new file mode 100644 index 000000000..ab97bbbee --- /dev/null +++ b/Android/APIExample/agora-stream-encrypt/src/main/java/io/agora/api/streamencrypt/PacketProcessor.java @@ -0,0
+1,18 @@ +package io.agora.api.streamencrypt; + +public class PacketProcessor { + static { + System.loadLibrary("agora-stream-encrypt"); + } + + public static void registerProcessing(long rtcEngineHandler) { + doRegisterProcessing(rtcEngineHandler); + } + + public static void unregisterProcessing(long rtcEngineHandler) { + doUnregisterProcessing(rtcEngineHandler); + } + + private native static void doRegisterProcessing(long rtcEngineHandler); + private native static void doUnregisterProcessing(long rtcEngineHandler); +} diff --git a/Android/APIExample/app/build.gradle b/Android/APIExample/app/build.gradle index 5a38f0484..bec40f6a2 100644 --- a/Android/APIExample/app/build.gradle +++ b/Android/APIExample/app/build.gradle @@ -1,5 +1,6 @@ apply plugin: 'com.android.application' apply plugin: 'kotlin-android' +apply from: "${rootDir.absolutePath}/git-hooks.gradle" def localSdkPath= "${rootProject.projectDir.absolutePath}/../../sdk" @@ -62,7 +63,7 @@ dependencies { implementation fileTree(dir: "${localSdkPath}", include: ['*.jar', '*.aar']) } else{ - def agora_sdk_version = "4.2.6" + def agora_sdk_version = "4.3.0" // case 1: full libs implementation "io.agora.rtc:full-sdk:${agora_sdk_version}" implementation "io.agora.rtc:full-screen-sharing:${agora_sdk_version}" @@ -96,6 +97,9 @@ dependencies { if (simpleFilter.toBoolean()) { implementation project(path: ':agora-simple-filter') } + if (streamEncrypt.toBoolean()) { + implementation project(path: ':agora-stream-encrypt') + } testImplementation 'junit:junit:4.12' androidTestImplementation 'androidx.test.ext:junit:1.1.3' androidTestImplementation 'androidx.test.espresso:espresso-core:3.4.0' diff --git a/Android/APIExample/app/src/main/AndroidManifest.xml b/Android/APIExample/app/src/main/AndroidManifest.xml index 9a95d9d52..35df3018e 100644 --- a/Android/APIExample/app/src/main/AndroidManifest.xml +++ b/Android/APIExample/app/src/main/AndroidManifest.xml @@ -53,6 +53,9 @@ android:windowSoftInputMode="stateAlwaysHidden" 
android:screenOrientation="portrait" /> + + \ No newline at end of file diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/MainApplication.java b/Android/APIExample/app/src/main/java/io/agora/api/example/MainApplication.java index 0a057c895..83aef9962 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/MainApplication.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/MainApplication.java @@ -10,6 +10,9 @@ import io.agora.api.example.common.model.GlobalSettings; import io.agora.api.example.utils.ClassUtils; +/** + * The type Main application. + */ public class MainApplication extends Application { private GlobalSettings globalSettings; @@ -34,14 +37,18 @@ private void initExamples() { } } Examples.sortItem(); - } - catch (Exception e) { + } catch (Exception e) { e.printStackTrace(); } } + /** + * Gets global settings. + * + * @return the global settings + */ public GlobalSettings getGlobalSettings() { - if(globalSettings == null){ + if (globalSettings == null) { globalSettings = new GlobalSettings(); } return globalSettings; diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/MainFragment.java b/Android/APIExample/app/src/main/java/io/agora/api/example/MainFragment.java index 1be00cf4c..54aebb2ec 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/MainFragment.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/MainFragment.java @@ -31,9 +31,9 @@ * interface. 
*/ public class MainFragment extends Fragment { - // TODO: Customize parameter argument names + // Customize parameter argument names private static final String ARG_COLUMN_COUNT = "column-count"; - // TODO: Customize parameters + // Customize parameters private int mColumnCount = 1; private OnListFragmentInteractionListener mListener; @@ -44,7 +44,12 @@ public class MainFragment extends Fragment { public MainFragment() { } - // TODO: Customize parameter initialization + /** + * New instance main fragment. + * + * @param columnCount the column count + * @return the main fragment + */ @SuppressWarnings("unused") public static MainFragment newInstance(int columnCount) { MainFragment fragment = new MainFragment(); @@ -114,7 +119,11 @@ public void onDetach() { * >Communicating with Other Fragments for more information. */ public interface OnListFragmentInteractionListener { - // TODO: Update argument type and name + /** + * Update argument type and name. + * + * @param item the item + */ void onListFragmentInteraction(Example item); } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/ReadyFragment.java b/Android/APIExample/app/src/main/java/io/agora/api/example/ReadyFragment.java index 8256cda9b..7ed7dc652 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/ReadyFragment.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/ReadyFragment.java @@ -54,7 +54,7 @@ public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup c public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) { super.onViewCreated(view, savedInstanceState); ActionBar actionBar = ((AppCompatActivity) getActivity()).getSupportActionBar(); - if(actionBar != null){ + if (actionBar != null) { actionBar.setTitle(exampleBean.getName()); actionBar.setHomeButtonEnabled(true); actionBar.setDisplayHomeAsUpEnabled(true); @@ -106,8 +106,7 @@ private void runOnPermissionGranted(@NonNull Runnable runnable) { // 
Request permission AndPermission.with(this).runtime().permission( permissionArray - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted runnable.run(); }).start(); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/SettingActivity.java b/Android/APIExample/app/src/main/java/io/agora/api/example/SettingActivity.java index 62f6d13f9..50167a161 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/SettingActivity.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/SettingActivity.java @@ -17,7 +17,7 @@ /** * @author cjw */ -public class SettingActivity extends AppCompatActivity{ +public class SettingActivity extends AppCompatActivity { private static final String TAG = SettingActivity.class.getSimpleName(); private ActivitySettingLayoutBinding mBinding; @@ -31,24 +31,24 @@ protected void onCreate(@Nullable Bundle savedInstanceState) { mBinding.sdkVersion.setText(String.format(getString(R.string.sdkversion1), RtcEngine.getSdkVersion())); String[] mItems = getResources().getStringArray(R.array.orientations); String[] labels = new String[mItems.length]; - for(int i = 0;i arrayAdapter =new ArrayAdapter(this,android.R.layout.simple_spinner_dropdown_item, labels); + ArrayAdapter arrayAdapter = new ArrayAdapter(this, android.R.layout.simple_spinner_dropdown_item, labels); mBinding.orientationSpinner.setAdapter(arrayAdapter); fetchGlobalSettings(); } - private void fetchGlobalSettings(){ - GlobalSettings globalSettings = ((MainApplication)getApplication()).getGlobalSettings(); + private void fetchGlobalSettings() { + GlobalSettings globalSettings = ((MainApplication) getApplication()).getGlobalSettings(); String[] mItems = getResources().getStringArray(R.array.orientations); String selectedItem = globalSettings.getVideoEncodingOrientation(); int i = 0; - if(selectedItem!=null){ - for(String item : mItems){ - if(selectedItem.equals(item)){ + if (selectedItem != null) { + for (String item : 
mItems) { + if (selectedItem.equals(item)) { break; } i++; @@ -58,9 +58,9 @@ private void fetchGlobalSettings(){ mItems = getResources().getStringArray(R.array.fps); selectedItem = globalSettings.getVideoEncodingFrameRate(); i = 0; - if(selectedItem!=null){ - for(String item : mItems){ - if(selectedItem.equals(item)){ + if (selectedItem != null) { + for (String item : mItems) { + if (selectedItem.equals(item)) { break; } i++; @@ -70,9 +70,9 @@ private void fetchGlobalSettings(){ mItems = getResources().getStringArray(R.array.dimensions); selectedItem = globalSettings.getVideoEncodingDimension(); i = 0; - if(selectedItem!=null){ - for(String item : mItems){ - if(selectedItem.equals(item)){ + if (selectedItem != null) { + for (String item : mItems) { + if (selectedItem.equals(item)) { break; } i++; @@ -82,9 +82,9 @@ private void fetchGlobalSettings(){ mItems = getResources().getStringArray(R.array.areaCode); selectedItem = globalSettings.getAreaCodeStr(); i = 0; - if(selectedItem!=null){ - for(String item : mItems){ - if(selectedItem.equals(item)){ + if (selectedItem != null) { + for (String item : mItems) { + if (selectedItem.equals(item)) { break; } i++; @@ -111,7 +111,7 @@ public boolean onCreateOptionsMenu(@NonNull Menu menu) { @Override public boolean onOptionsItemSelected(MenuItem item) { if (item.getItemId() == saveMenu.getItemId()) { - GlobalSettings globalSettings = ((MainApplication)getApplication()).getGlobalSettings(); + GlobalSettings globalSettings = ((MainApplication) getApplication()).getGlobalSettings(); globalSettings.privateCloudIp = mBinding.privateCloudLayout.etIpAddress.getText().toString(); globalSettings.privateCloudLogReportEnable = mBinding.privateCloudLayout.swLogReport.isChecked(); globalSettings.privateCloudLogServerDomain = mBinding.privateCloudLayout.etLogServerDomain.getText().toString(); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/annotation/Example.java 
b/Android/APIExample/app/src/main/java/io/agora/api/example/annotation/Example.java index bea15b542..1e790bb1d 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/annotation/Example.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/annotation/Example.java @@ -1,8 +1,5 @@ package io.agora.api.example.annotation; -import android.os.Parcelable; - -import java.io.Serializable; import java.lang.annotation.ElementType; import java.lang.annotation.Retention; import java.lang.annotation.RetentionPolicy; diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/BaseBrowserFragment.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/BaseBrowserFragment.java index 99bc4f736..ee3457e1e 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/BaseBrowserFragment.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/BaseBrowserFragment.java @@ -38,6 +38,9 @@ import io.agora.api.example.R; import io.agora.api.example.databinding.FragmentBaseBrowserBinding; +/** + * The type Base browser fragment. 
+ */ public abstract class BaseBrowserFragment extends BaseFragment { private FragmentBaseBrowserBinding mBinding; @@ -122,7 +125,7 @@ public void handleOnBackPressed() { // Do not use the cache; fetch data only from the network // LOAD_CACHE_ONLY: // Do not use the network; read only locally cached data - webSettings.setCacheMode(WebSettings.LOAD_NO_CACHE);// Set the cache mode + webSettings.setCacheMode(WebSettings.LOAD_NO_CACHE); // Set the cache mode // JavaScript related @@ -153,7 +156,7 @@ public void handleOnBackPressed() { // Allow both https and http content webSettings.setMixedContentMode(WebSettings.MIXED_CONTENT_ALWAYS_ALLOW); - mWebView.setWebChromeClient(new WebChromeClient(){ + mWebView.setWebChromeClient(new WebChromeClient() { @Override public boolean onConsoleMessage(ConsoleMessage consoleMessage) { @@ -175,7 +178,7 @@ public boolean onJsAlert(WebView view, String url, String message, JsResult resu @Override public boolean shouldOverrideUrlLoading(WebView view, WebResourceRequest request) { - if(!request.getUrl().toString().equals(getBrowserUrl())){ + if (!request.getUrl().toString().equals(getBrowserUrl())) { openWithDefaultBrowser(request.getUrl().toString()); return true; } @@ -212,7 +215,7 @@ public void onReceivedError(WebView view, WebResourceRequest request, WebResourc mWebView.setWebContentsDebuggingEnabled(true); } - private void releaseWebView(){ + private void releaseWebView() { try { Field sConfigCallback = Class.forName("android.webkit.BrowserFrame").getDeclaredField("sConfigCallback"); if (sConfigCallback != null) { @@ -261,6 +264,11 @@ private void openWithDefaultBrowser(String url) { startActivity(intent); } + /** + * Gets browser url.
+ * + * @return the browser url + */ protected abstract String getBrowserUrl(); } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/BaseFragment.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/BaseFragment.java index 88a870780..3dde5fc35 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/BaseFragment.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/BaseFragment.java @@ -15,7 +15,13 @@ import androidx.fragment.app.Fragment; import androidx.navigation.Navigation; +/** + * The type Base fragment. + */ public class BaseFragment extends Fragment { + /** + * The Handler. + */ protected Handler handler; private AlertDialog mAlertDialog; private String mAlertMessage; @@ -45,11 +51,22 @@ public void onDetach() { onBackPressedCallback.setEnabled(false); } + /** + * Show alert. + * + * @param message the message + */ protected void showAlert(String message) { this.showAlert(message, true); } - protected void showAlert(String message, boolean showRepeatMsg){ + /** + * Show alert. + * + * @param message the message + * @param showRepeatMsg the show repeat msg + */ + protected void showAlert(String message, boolean showRepeatMsg) { runOnUIThread(() -> { Context context = getContext(); if (context == null) { @@ -60,7 +77,7 @@ protected void showAlert(String message, boolean showRepeatMsg){ .setPositiveButton("OK", (dialog, which) -> dialog.dismiss()) .create(); } - if(!showRepeatMsg && !TextUtils.isEmpty(mAlertMessage) && mAlertMessage.equals(message)){ + if (!showRepeatMsg && !TextUtils.isEmpty(mAlertMessage) && mAlertMessage.equals(message)) { return; } mAlertMessage = message; @@ -69,10 +86,18 @@ protected void showAlert(String message, boolean showRepeatMsg){ }); } - protected void resetAlert(){ + /** + * Reset alert. + */ + protected void resetAlert() { runOnUIThread(() -> mAlertMessage = ""); } + /** + * Show long toast. 
+ * + * @param msg the msg + */ protected final void showLongToast(final String msg) { runOnUIThread(() -> { Context context = getContext(); @@ -83,6 +108,11 @@ protected final void showLongToast(final String msg) { }); } + /** + * Show short toast. + * + * @param msg the msg + */ protected final void showShortToast(final String msg) { runOnUIThread(() -> { Context context = getContext(); @@ -93,10 +123,21 @@ protected final void showShortToast(final String msg) { }); } + /** + * Run on ui thread. + * + * @param runnable the runnable + */ protected final void runOnUIThread(Runnable runnable) { this.runOnUIThread(runnable, 0); } + /** + * Run on ui thread. + * + * @param runnable the runnable + * @param delay the delay + */ protected final void runOnUIThread(Runnable runnable, long delay) { if (handler != null && runnable != null && getContext() != null) { if (delay <= 0 && handler.getLooper().getThread() == Thread.currentThread()) { @@ -121,6 +162,9 @@ public void onDestroy() { } } + /** + * On back pressed. + */ protected void onBackPressed() { View view = getView(); if (view != null) { diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/Constant.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/Constant.java index ad4af3033..340dadf85 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/Constant.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/Constant.java @@ -1,32 +1,57 @@ package io.agora.api.example.common; -import android.view.TextureView; +/** + * The type Constant. + */ +public final class Constant { -import io.agora.rtc2.RtcEngine; + private Constant() { -public class Constant { - public static TextureView TEXTUREVIEW; + } - public static RtcEngine ENGINE; - - public static String TIPS = "tips"; - - public static String DATA = "data"; + /** + * The constant DATA. + */ + public static final String DATA = "data"; + /** + * The constant MIX_FILE_PATH. 
+ */ public static final String MIX_FILE_PATH = "/assets/music_1.m4a"; + /** + * The constant EFFECT_FILE_PATH. + */ public static final String EFFECT_FILE_PATH = "https://webdemo.agora.io/ding.mp3"; + /** + * The constant WATER_MARK_FILE_PATH. + */ public static final String WATER_MARK_FILE_PATH = "/assets/agora-logo.png"; + /** + * The constant URL_PLAY_AUDIO_FILES_MULTI_TRACK. + */ public static final String URL_PLAY_AUDIO_FILES_MULTI_TRACK = "https://webdemo.agora.io/mTrack.m4a"; + /** + * The constant URL_PLAY_AUDIO_FILES. + */ public static final String URL_PLAY_AUDIO_FILES = "https://webdemo.agora.io/audiomixing.mp3"; + /** + * The constant URL_UPBEAT. + */ public static final String URL_UPBEAT = "https://webdemo.agora.io/ding.mp3"; + /** + * The constant URL_DOWNBEAT. + */ public static final String URL_DOWNBEAT = "https://webdemo.agora.io/dang.mp3"; - public static final String URL_VIDEO_SAMPLE= "http://agora-adc-artifacts.s3.cn-north-1.amazonaws.com.cn/resources/sample.mp4"; + /** + * The constant URL_VIDEO_SAMPLE. + */ + public static final String URL_VIDEO_SAMPLE = "http://agora-adc-artifacts.s3.cn-north-1.amazonaws.com.cn/resources/sample.mp4"; } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/adapter/ExampleSection.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/adapter/ExampleSection.java index 31505545d..775f1ee35 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/adapter/ExampleSection.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/adapter/ExampleSection.java @@ -13,11 +13,21 @@ import io.github.luizgrp.sectionedrecyclerviewadapter.Section; import io.github.luizgrp.sectionedrecyclerviewadapter.SectionParameters; +/** + * The type Example section. 
+ */ public class ExampleSection extends Section { private final String mTitle; private final List mValues; private final MainFragment.OnListFragmentInteractionListener mListener; + /** + * Instantiates a new Example section. + * + * @param title the title + * @param items the items + * @param listener the listener + */ public ExampleSection(String title, List items, MainFragment.OnListFragmentInteractionListener listener) { super(SectionParameters.builder().headerResourceId(R.layout.layout_main_list_section).itemResourceId(R.layout.layout_main_list_item).build()); mTitle = title; @@ -65,11 +75,28 @@ public void onBindHeaderViewHolder(RecyclerView.ViewHolder viewHolder) { } } + /** + * The type View holder. + */ static class ViewHolder extends RecyclerView.ViewHolder { + /** + * The M view. + */ final View mView; + /** + * The M name view. + */ final TextView mNameView; + /** + * The M item. + */ Example mItem; + /** + * Instantiates a new View holder. + * + * @param view the view + */ ViewHolder(View view) { super(view); mView = view; diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/AVCallFloatView.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/AVCallFloatView.java index 531511849..44faabb7b 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/AVCallFloatView.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/AVCallFloatView.java @@ -13,6 +13,9 @@ import android.view.animation.Interpolator; import android.widget.FrameLayout; +/** + * The type Av call float view. + */ public class AVCallFloatView extends FrameLayout { private static final String TAG = "AVCallFloatView"; @@ -51,6 +54,11 @@ public class AVCallFloatView extends FrameLayout { private WindowManager.LayoutParams mParams = null; + /** + * Instantiates a new Av call float view. 
+ * + * @param context the context + */ public AVCallFloatView(Context context) { super(context); initView(); } @@ -60,10 +68,20 @@ private void initView() { windowManager = (WindowManager) getContext().getSystemService(Context.WINDOW_SERVICE); } + /** + * Sets params. + * + * @param params the params + */ public void setParams(WindowManager.LayoutParams params) { mParams = params; } + /** + * Sets is showing. + * + * @param isShowing the is showing + */ public void setIsShowing(boolean isShowing) { this.isShowing = isShowing; } @@ -92,7 +110,7 @@ public boolean onTouchEvent(MotionEvent event) { if (Math.abs(xDownInScreen - xInScreen) <= ViewConfiguration.get(getContext()).getScaledTouchSlop() && Math.abs(yDownInScreen - yInScreen) <= ViewConfiguration.get(getContext()).getScaledTouchSlop()) { // Click effect - // Toast.makeText(getContext(), "this float window is clicked", Toast.LENGTH_SHORT).show(); + Log.d(TAG, "this float window is clicked"); } else { //Snap-to-edge effect anchorToSide(); } @@ -119,44 +137,44 @@ private void anchorToSide() { int dp_25 = dp2px(15); - //1 - if (middleX <= dp_25 + getWidth() / 2) { + + if (middleX <= dp_25 + getWidth() / 2) { //1 xDistance = dp_25 - mParams.x; - } - //2 - else if (middleX <= screenWidth / 2) { + } else if (middleX <= screenWidth / 2) { //2 xDistance = dp_25 - mParams.x; - } - //3 - else if (middleX >= screenWidth - getWidth() / 2 - dp_25) { + } else if (middleX >= screenWidth - getWidth() / 2 - dp_25) { //3 xDistance = screenWidth - mParams.x - getWidth() - dp_25; - } - //4 - else { + } else { //4 xDistance = screenWidth - mParams.x - getWidth() - dp_25; } - //1 - if (mParams.y < dp_25) { + + if (mParams.y < dp_25) { //1 yDistance = dp_25 - mParams.y; - } - //2 - else if (mParams.y + getHeight() + dp_25 >= screenHeight) { + } else if (mParams.y + getHeight() + dp_25 >= screenHeight) { //2 yDistance = screenHeight - dp_25 - mParams.y - getHeight(); } Log.e(TAG, "xDistance " + xDistance + " yDistance" + yDistance); - animTime =
Math.abs(xDistance) > Math.abs(yDistance) ? (int) (((float) xDistance / (float) screenWidth) * 600f) - : (int) (((float) yDistance / (float) screenHeight) * 900f); + final float animFactorWidth = 600f; + final float animFactorHeight = 900f; + animTime = Math.abs(xDistance) > Math.abs(yDistance) ? (int) (((float) xDistance / (float) screenWidth) * animFactorWidth) + : (int) (((float) yDistance / (float) screenHeight) * animFactorHeight); this.post(new AnchorAnimRunnable(Math.abs(animTime), xDistance, yDistance, System.currentTimeMillis())); } - public int dp2px(float dp){ + /** + * Dp 2 px int. + * + * @param dp the dp + * @return the int + */ + public int dp2px(float dp) { final float scale = getContext().getResources().getDisplayMetrics().density; return (int) (dp * scale + 0.5f); } - private class AnchorAnimRunnable implements Runnable { + private final class AnchorAnimRunnable implements Runnable { private int animTime; private long currentStartTime; @@ -166,7 +184,7 @@ private class AnchorAnimRunnable implements Runnable { private int startX; private int startY; - public AnchorAnimRunnable(int animTime, int xDistance, int yDistance, long currentStartTime) { + private AnchorAnimRunnable(int animTime, int xDistance, int yDistance, long currentStartTime) { this.animTime = animTime; this.currentStartTime = currentStartTime; interpolator = new AccelerateDecelerateInterpolator(); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/FloatWindowHelper.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/FloatWindowHelper.java index ea3f24ef5..97ff0c671 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/FloatWindowHelper.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/FloatWindowHelper.java @@ -25,10 +25,25 @@ import io.agora.api.example.common.floatwindow.rom.QikuUtils; import 
io.agora.api.example.common.floatwindow.rom.RomUtils; -public class FloatWindowHelper { +/** + * The type Float window helper. + */ +public final class FloatWindowHelper { private static final String TAG = "FloatWindowHelper"; - public static AVCallFloatView createFloatView(@NonNull Context context, int xDp, int yDp){ + private FloatWindowHelper() { + + } + + /** + * Create float view av call float view. + * + * @param context the context + * @param xDp the x dp + * @param yDp the y dp + * @return the av call float view + */ + public static AVCallFloatView createFloatView(@NonNull Context context, int xDp, int yDp) { WindowManager windowManager = (WindowManager) context.getApplicationContext().getSystemService(Context.WINDOW_SERVICE); Point size = new Point(); windowManager.getDefaultDisplay().getSize(size); @@ -61,12 +76,23 @@ public static AVCallFloatView createFloatView(@NonNull Context context, int xDp, return floatView; } + /** + * Destroy float view. + * + * @param floatView the float view + */ public static void destroyFloatView(@NonNull AVCallFloatView floatView) { WindowManager windowManager = (WindowManager) floatView.getContext().getApplicationContext().getSystemService(Context.WINDOW_SERVICE); floatView.setIsShowing(false); windowManager.removeViewImmediate(floatView); } + /** + * Check permission boolean. + * + * @param context the context + * @return the boolean + */ public static boolean checkPermission(Context context) { //Since Android 6.0 Google manages the floating-window permission itself, so the approach is unified if (Build.VERSION.SDK_INT < 23) { @@ -85,6 +111,11 @@ public static boolean checkPermission(Context context) { return commonROMPermissionCheck(context); } + /** + * Apply permission.
+ * + * @param context the context + */ public static void applyPermission(Context context) { if (Build.VERSION.SDK_INT < 23) { if (RomUtils.checkIsMiuiRom()) { @@ -94,7 +125,7 @@ public static void applyPermission(Context context) { } else if (RomUtils.checkIsHuaweiRom()) { huaweiROMPermissionApply(context); } else if (RomUtils.checkIs360Rom()) { - ROM360PermissionApply(context); + rom360Permissionapply(context); } else if (RomUtils.checkIsOppoRom()) { oppoROMPermissionApply(context); } @@ -103,7 +134,14 @@ public static void applyPermission(Context context) { } } - public static int dp2px(Context context, float dp){ + /** + * Dp 2 px int. + * + * @param context the context + * @param dp the dp + * @return the int + */ + public static int dp2px(Context context, float dp) { final float scale = context.getResources().getDisplayMetrics().density; return (int) (dp * scale + 0.5f); } @@ -148,7 +186,7 @@ private static boolean commonROMPermissionCheck(Context context) { } } - private static void ROM360PermissionApply(final Context context) { + private static void rom360Permissionapply(final Context context) { showConfirmDialog(context, () -> QikuUtils.applyPermission(context)); } @@ -176,6 +214,8 @@ private static void oppoROMPermissionApply(final Context context) { /** * Generic ROM permission request + * + * @param context Context.
*/ private static void commonROMPermissionApply(final Context context) { //Same here: the Meizu system needs its own adaptation @@ -210,7 +250,7 @@ private static void showConfirmDialog(@NonNull Context context, final Runnable c .setMessage(R.string.float_window_confirm_dialog_msg) .setPositiveButton(R.string.float_window_confirm_dialog_confirm, (dialog, which) -> { - if(confirm != null){ + if (confirm != null) { confirm.run(); } dialog.dismiss(); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/HuaweiUtils.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/HuaweiUtils.java index bda6e2840..5315f0cfb 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/HuaweiUtils.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/HuaweiUtils.java @@ -16,11 +16,21 @@ import java.lang.reflect.Method; -public class HuaweiUtils { +/** + * The type Huawei utils. + */ +public final class HuaweiUtils { private static final String TAG = "HuaweiUtils"; + private HuaweiUtils() { + + } + /** * Check the Huawei floating-window permission + * + * @param context the context + * @return the boolean */ public static boolean checkFloatWindowPermission(Context context) { final int version = Build.VERSION.SDK_INT; @@ -32,6 +42,8 @@ public static boolean checkFloatWindowPermission(Context context) { /** * Open the Huawei permission request page + * + * @param context the context */ public static void applyPermission(Context context) { try { @@ -40,14 +52,15 @@ public static void applyPermission(Context context) { // ComponentName comp = new ComponentName("com.huawei.systemmanager","com.huawei.permissionmanager.ui.MainActivity");//Huawei permission management // ComponentName comp = new ComponentName("com.huawei.systemmanager", // "com.huawei.permissionmanager.ui.SingleAppActivity");//Huawei permission management; jumping to a specific app's permission page needs a Huawei API permission, unresolved - ComponentName comp = new ComponentName("com.huawei.systemmanager",
"com.huawei.systemmanager.addviewmonitor.AddViewMonitorActivity");//Floating window management page + ComponentName comp = new ComponentName("com.huawei.systemmanager", "com.huawei.systemmanager.addviewmonitor.AddViewMonitorActivity"); //Floating window management page intent.setComponent(comp); - if (RomUtils.getEmuiVersion() == 3.1) { + final double versionDiff = 3.1; + if (RomUtils.getEmuiVersion() == versionDiff) { //Adaptation for EMUI 3.1 context.startActivity(intent); } else { //Adaptation for EMUI 3.0 - comp = new ComponentName("com.huawei.systemmanager", "com.huawei.notificationmanager.ui.NotificationManagmentActivity");//Floating window management page + comp = new ComponentName("com.huawei.systemmanager", "com.huawei.notificationmanager.ui.NotificationManagmentActivity"); //Floating window management page intent.setComponent(comp); context.startActivity(intent); } @@ -56,20 +69,20 @@ public static void applyPermission(Context context) { intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK); // ComponentName comp = new ComponentName("com.huawei.systemmanager","com.huawei.permissionmanager.ui.MainActivity");//Huawei permission management ComponentName comp = new ComponentName("com.huawei.systemmanager", - "com.huawei.permissionmanager.ui.MainActivity");//Huawei permission management; jumping to this app's permission page needs a Huawei API permission, unresolved + "com.huawei.permissionmanager.ui.MainActivity"); //Huawei permission management; jumping to this app's permission page needs a Huawei API permission, unresolved // ComponentName comp = new ComponentName("com.huawei.systemmanager","com.huawei.systemmanager.addviewmonitor.AddViewMonitorActivity");//Floating window management page intent.setComponent(comp); context.startActivity(intent); Log.e(TAG, Log.getStackTraceString(e)); } catch (ActivityNotFoundException e) { - /** + /* * Phone Manager version is too old (HUAWEI SC-UL10) */ // Toast.makeText(MainActivity.this, "activity not found", Toast.LENGTH_LONG).show(); Intent intent = new Intent(); intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK); - ComponentName comp = new ComponentName("com.Android.settings", "com.android.settings.permission.TabItem");//Permission management page on Android 4.4 -// ComponentName comp = new
ComponentName("com.android.settings","com.android.settings.permission.single_app_activity");//Can jump to the specified app's permission management page, but requires related permissions, unresolved + ComponentName comp = new ComponentName("com.Android.settings", "com.android.settings.permission.TabItem"); //Permission management page on Android 4.4 +// ComponentName comp = new ComponentName("com.android.settings","com.android.settings.permission.single_app_activity"); //Can jump to the specified app's permission management page, but requires related permissions, unresolved intent.setComponent(comp); context.startActivity(intent); e.printStackTrace(); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/MeizuUtils.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/MeizuUtils.java index 0ae1cb6b2..e91fd167c 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/MeizuUtils.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/MeizuUtils.java @@ -13,11 +13,21 @@ import java.lang.reflect.Method; -public class MeizuUtils { +/** + * The type Meizu utils.
+ */ +public final class MeizuUtils { private static final String TAG = "MeizuUtils"; + private MeizuUtils() { + + } + /** * Check the Meizu floating-window permission + * + * @param context the context + * @return the boolean */ public static boolean checkFloatWindowPermission(Context context) { final int version = Build.VERSION.SDK_INT; @@ -29,6 +39,9 @@ public static boolean checkFloatWindowPermission(Context context) { /** * Open the Meizu permission request page + * + * @param context the context + * @param errHandler the err handler */ public static void applyPermission(Context context, Runnable errHandler) { try { @@ -37,12 +50,12 @@ public static void applyPermission(Context context, Runnable errHandler) { intent.putExtra("packageName", context.getPackageName()); intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK); context.startActivity(intent); - }catch (Exception e) { + } catch (Exception e) { try { Log.e(TAG, "Failed to get the floating-window permission; opening AppSecActivity failed, " + Log.getStackTraceString(e)); // On the latest Meizu Flyme 6.2.5 the method above fails to obtain the permission, but the method below can still obtain it // FloatWindowManager.commonROMPermissionApplyInternal(context); - if(errHandler != null){ + if (errHandler != null) { errHandler.run(); } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/MiuiUtils.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/MiuiUtils.java index 9d016b779..9130e1de3 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/MiuiUtils.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/MiuiUtils.java @@ -16,9 +16,16 @@ import java.lang.reflect.Method; -public class MiuiUtils { +/** + * The type Miui utils.
+ */ +public final class MiuiUtils { private static final String TAG = "MiuiUtils"; + private MiuiUtils() { + + } + /** * Get the Xiaomi ROM version number; returns -1 on failure * @@ -39,6 +46,9 @@ public static int getMiuiVersion() { /** * Check the MIUI floating-window permission + * + * @param context the context + * @return the boolean */ public static boolean checkFloatWindowPermission(Context context) { final int version = Build.VERSION.SDK_INT; @@ -75,17 +85,19 @@ private static boolean checkOp(Context context, int op) { /** * Xiaomi ROM permission request + * + * @param context the context */ public static void applyMiuiPermission(Context context) { int versionCode = getMiuiVersion(); if (versionCode == 5) { - goToMiuiPermissionActivity_V5(context); + goToMiuiPermissionActivityV5(context); } else if (versionCode == 6) { - goToMiuiPermissionActivity_V6(context); + goToMiuiPermissionActivityV6(context); } else if (versionCode == 7) { - goToMiuiPermissionActivity_V7(context); + goToMiuiPermissionActivityV7(context); } else if (versionCode == 8) { - goToMiuiPermissionActivity_V8(context); + goToMiuiPermissionActivityV8(context); } else { Log.e(TAG, "this is a special MIUI rom version, its version code " + versionCode); } @@ -100,8 +112,10 @@ private static boolean isIntentAvailable(Intent intent, Context context) { /** * Xiaomi V5 ROM permission request + * + * @param context the context */ - public static void goToMiuiPermissionActivity_V5(Context context) { + public static void goToMiuiPermissionActivityV5(Context context) { Intent intent = null; String packageName = context.getPackageName(); intent = new Intent(Settings.ACTION_APPLICATION_DETAILS_SETTINGS); @@ -135,8 +149,10 @@ public static void goToMiuiPermissionActivity_V5(Context context) { /** * Xiaomi V6 ROM permission request + * + * @param context the context */ - public static void goToMiuiPermissionActivity_V6(Context context) { + public static void goToMiuiPermissionActivityV6(Context context) { Intent intent = new Intent("miui.intent.action.APP_PERM_EDITOR");
intent.setClassName("com.miui.securitycenter", "com.miui.permcenter.permissions.AppPermissionsEditorActivity"); intent.putExtra("extra_pkgname", context.getPackageName()); @@ -151,8 +167,10 @@ public static void goToMiuiPermissionActivity_V6(Context context) { /** * Xiaomi V7 ROM permission request + * + * @param context the context */ - public static void goToMiuiPermissionActivity_V7(Context context) { + public static void goToMiuiPermissionActivityV7(Context context) { Intent intent = new Intent("miui.intent.action.APP_PERM_EDITOR"); intent.setClassName("com.miui.securitycenter", "com.miui.permcenter.permissions.AppPermissionsEditorActivity"); intent.putExtra("extra_pkgname", context.getPackageName()); @@ -167,8 +185,10 @@ public static void goToMiuiPermissionActivity_V7(Context context) { /** * Xiaomi V8 ROM permission request + * + * @param context the context */ - public static void goToMiuiPermissionActivity_V8(Context context) { + public static void goToMiuiPermissionActivityV8(Context context) { Intent intent = new Intent("miui.intent.action.APP_PERM_EDITOR"); intent.setClassName("com.miui.securitycenter", "com.miui.permcenter.permissions.PermissionsEditorActivity"); // intent.setPackage("com.miui.securitycenter"); @@ -182,7 +202,7 @@ public static void goToMiuiPermissionActivity_V8(Context context) { intent.setPackage("com.miui.securitycenter"); intent.putExtra("extra_pkgname", context.getPackageName()); intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK); - + if (isIntentAvailable(intent, context)) { context.startActivity(intent); } else { diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/OppoUtils.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/OppoUtils.java index 9c4bf6252..4a24ed729 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/OppoUtils.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/OppoUtils.java @@ 
-11,12 +11,22 @@ import java.lang.reflect.Method; -public class OppoUtils { +/** + * The type Oppo utils. + */ +public final class OppoUtils { private static final String TAG = "OppoUtils"; + private OppoUtils() { + + } + /** * Check the 360 floating-window permission + * + * @param context the context + * @return the boolean */ public static boolean checkFloatWindowPermission(Context context) { final int version = Build.VERSION.SDK_INT; @@ -46,6 +56,8 @@ private static boolean checkOp(Context context, int op) { /** * OPPO ROM permission request + * + * @param context the context */ public static void applyOppoPermission(Context context) { //merge request from https://github.com/zhaozepeng/FloatWindowPermission/pull/26 @@ -53,11 +65,10 @@ public static void applyOppoPermission(Context context) { try { Intent intent = new Intent(); intent.setFlags(Intent.FLAG_ACTIVITY_NEW_TASK); //com.coloros.safecenter/.sysfloatwindow.FloatWindowListActivity - ComponentName comp = new ComponentName("com.coloros.safecenter", "com.coloros.safecenter.sysfloatwindow.FloatWindowListActivity");//Floating window management page + ComponentName comp = new ComponentName("com.coloros.safecenter", "com.coloros.safecenter.sysfloatwindow.FloatWindowListActivity"); //Floating window management page intent.setComponent(comp); context.startActivity(intent); - } - catch(Exception e){ + } catch (Exception e) { e.printStackTrace(); } } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/QikuUtils.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/QikuUtils.java index 1147d94b5..bfd5c43d1 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/QikuUtils.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/QikuUtils.java @@ -14,11 +14,21 @@ import java.lang.reflect.Method; -public class QikuUtils { +/** + * The type Qiku utils.
+ */ +public final class QikuUtils { private static final String TAG = "QikuUtils"; + private QikuUtils() { + + } + /** * Check the 360 floating-window permission + * + * @param context the context + * @return the boolean */ public static boolean checkFloatWindowPermission(Context context) { final int version = Build.VERSION.SDK_INT; @@ -36,7 +46,7 @@ private static boolean checkOp(Context context, int op) { try { Class clazz = AppOpsManager.class; Method method = clazz.getDeclaredMethod("checkOp", int.class, int.class, String.class); - return AppOpsManager.MODE_ALLOWED == (int)method.invoke(manager, op, Binder.getCallingUid(), context.getPackageName()); + return AppOpsManager.MODE_ALLOWED == (int) method.invoke(manager, op, Binder.getCallingUid(), context.getPackageName()); } catch (Exception e) { Log.e(TAG, Log.getStackTraceString(e)); } @@ -48,6 +58,8 @@ private static boolean checkOp(Context context, int op) { /** * Open the 360 permission request page + * + * @param context the context */ public static void applyPermission(Context context) { Intent intent = new Intent(); @@ -60,8 +72,8 @@ public static void applyPermission(Context context) { if (isIntentAvailable(intent, context)) { context.startActivity(intent); } else { - Log.e(TAG, "can't open permission page with particular name, please use " + - "\"adb shell dumpsys activity\" command and tell me the name of the float window permission page"); + Log.e(TAG, "can't open permission page with particular name, please use " + + "\"adb shell dumpsys activity\" command and tell me the name of the float window permission page"); } } } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/RomUtils.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/RomUtils.java index 20c08d9b6..536b912e4 100644 --- 
b/Android/APIExample/app/src/main/java/io/agora/api/example/common/floatwindow/rom/RomUtils.java @@ -11,14 +11,23 @@ import java.io.IOException; import java.io.InputStreamReader; -public class RomUtils { +/** + * The type Rom utils. + */ +public final class RomUtils { private static final String TAG = "RomUtils"; + private RomUtils() { + + } + /** * Get the EMUI version number - * @return + * + * @return emui version */ public static double getEmuiVersion() { + final double defaultVersion = 4.0; try { String emuiVersion = getSystemProperty("ro.build.version.emui"); String version = emuiVersion.substring(emuiVersion.indexOf("_") + 1); @@ -26,7 +35,7 @@ public static double getEmuiVersion() { } catch (Exception e) { e.printStackTrace(); } - return 4.0; + return defaultVersion; } /** @@ -46,6 +55,12 @@ public static int getMiuiVersion() { return -1; } + /** + * Gets system property. + * + * @param propName the prop name + * @return the system property + */ public static String getSystemProperty(String propName) { String line; BufferedReader input = null; @@ -68,35 +83,56 @@ public static String getSystemProperty(String propName) { } return line; } + + /** + * Check is huawei rom boolean. + * + * @return the boolean + */ public static boolean checkIsHuaweiRom() { return Build.MANUFACTURER.contains("HUAWEI"); } /** * check if is miui ROM + * + * @return the boolean */ public static boolean checkIsMiuiRom() { return !TextUtils.isEmpty(getSystemProperty("ro.miui.ui.version.name")); } + /** + * Check is meizu rom boolean.
+ * + * @return the boolean + */ public static boolean checkIsMeizuRom() { //return Build.MANUFACTURER.contains("Meizu"); - String meizuFlymeOSFlag = getSystemProperty("ro.build.display.id"); - if (TextUtils.isEmpty(meizuFlymeOSFlag)){ - return false; - }else if (meizuFlymeOSFlag.contains("flyme") || meizuFlymeOSFlag.toLowerCase().contains("flyme")){ - return true; - }else { + String meizuFlymeOSFlag = getSystemProperty("ro.build.display.id"); + if (TextUtils.isEmpty(meizuFlymeOSFlag)) { return false; + } else { + return meizuFlymeOSFlag.contains("flyme") || meizuFlymeOSFlag.toLowerCase().contains("flyme"); } } + /** + * Check is 360 rom boolean. + * + * @return the boolean + */ public static boolean checkIs360Rom() { //fix issue https://github.com/zhaozepeng/FloatWindowPermission/issues/9 return Build.MANUFACTURER.contains("QiKU") || Build.MANUFACTURER.contains("360"); } + /** + * Check is oppo rom boolean. + * + * @return the boolean + */ public static boolean checkIsOppoRom() { //https://github.com/zhaozepeng/FloatWindowPermission/pull/26 return Build.MANUFACTURER.contains("OPPO") || Build.MANUFACTURER.contains("oppo"); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/Drawable2dFull.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/Drawable2dFull.java index 8db5cb290..0b52e5f64 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/Drawable2dFull.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/Drawable2dFull.java @@ -31,19 +31,22 @@ public class Drawable2dFull extends Drawable2d { * The texture coordinates are Y-inverted relative to RECTANGLE. (This seems to work out * right with external textures from SurfaceTexture.) 
*/ - private static final float FULL_RECTANGLE_COORDS[] = { + private static final float[] FULL_RECTANGLE_COORDS = { -1.0f, -1.0f, // 0 bottom left 1.0f, -1.0f, // 1 bottom right -1.0f, 1.0f, // 2 top left 1.0f, 1.0f, // 3 top right }; - private static final float FULL_RECTANGLE_TEX_COORDS[] = { + private static final float[] FULL_RECTANGLE_TEX_COORDS = { 0.0f, 0.0f, // 0 bottom left 1.0f, 0.0f, // 1 bottom right 0.0f, 1.0f, // 2 top left 1.0f, 1.0f // 3 top right }; + /** + * Instantiates a new Drawable 2 d full. + */ public Drawable2dFull() { super(FULL_RECTANGLE_COORDS, FULL_RECTANGLE_TEX_COORDS); } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/Drawable2dLandmarks.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/Drawable2dLandmarks.java deleted file mode 100644 index 29d798b12..000000000 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/Drawable2dLandmarks.java +++ /dev/null @@ -1,33 +0,0 @@ -/* - * Copyright 2014 Google Inc. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package io.agora.api.example.common.gles; - - -import io.agora.api.example.common.gles.core.Drawable2d; - -/** - * Base class for stuff we like to draw. 
- */ -public class Drawable2dLandmarks extends Drawable2d { - - - private float pointsCoords[] = new float[150]; - - public Drawable2dLandmarks() { - updateVertexArray(pointsCoords); - } -} diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/GLTestUtils.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/GLTestUtils.java deleted file mode 100644 index f909044d2..000000000 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/GLTestUtils.java +++ /dev/null @@ -1,125 +0,0 @@ -package io.agora.api.example.common.gles; - -import android.graphics.Bitmap; -import android.graphics.BitmapFactory; -import android.graphics.ImageFormat; -import android.graphics.Rect; -import android.graphics.YuvImage; -import android.opengl.GLES11Ext; -import android.opengl.GLES20; -import android.util.Log; - -import java.io.ByteArrayOutputStream; -import java.io.IOException; -import java.nio.ByteBuffer; -import java.nio.IntBuffer; - -public class GLTestUtils { - private static final String TAG = "GLUtils"; - - public static Bitmap getTexture2DImage(int textureID, int width, int height) { - try { - int[] oldFboId = new int[1]; - GLES20.glGetIntegerv(GLES20.GL_FRAMEBUFFER_BINDING, IntBuffer.wrap(oldFboId)); - - int[] framebuffers = new int[1]; - GLES20.glGenFramebuffers(1, framebuffers, 0); - int framebufferId = framebuffers[0]; - GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, framebufferId); - - int[] renderbuffers = new int[1]; - GLES20.glGenRenderbuffers(1, renderbuffers, 0); - int renderId = renderbuffers[0]; - GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderId); - GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, width, height); - - GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, textureID, 0); - GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT, GLES20.GL_RENDERBUFFER, renderId); - 
if (GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER) != GLES20.GL_FRAMEBUFFER_COMPLETE) { - Log.d(TAG, "Framebuffer error"); - } - - ByteBuffer rgbaBuf = ByteBuffer.allocateDirect(width * height * 4); - rgbaBuf.position(0); - GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, rgbaBuf); - - Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888); - bitmap.copyPixelsFromBuffer(rgbaBuf); - - GLES20.glDeleteRenderbuffers(1, IntBuffer.wrap(framebuffers)); - GLES20.glDeleteFramebuffers(1, IntBuffer.allocate(framebufferId)); - - GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, oldFboId[0]); - - return bitmap; - } catch (Exception e) { - Log.e(TAG, "", e); - } - return null; - } - - public static Bitmap getTextureOESImage(int textureID, int width, int height) { - try { - int[] oldFboId = new int[1]; - GLES20.glGetIntegerv(GLES20.GL_FRAMEBUFFER_BINDING, IntBuffer.wrap(oldFboId)); - - int[] framebuffers = new int[1]; - GLES20.glGenFramebuffers(1, framebuffers, 0); - int framebufferId = framebuffers[0]; - GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, framebufferId); - - int[] renderbuffers = new int[1]; - GLES20.glGenRenderbuffers(1, renderbuffers, 0); - int renderId = renderbuffers[0]; - GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderId); - GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, width, height); - - GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0, GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureID, 0); - GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT, GLES20.GL_RENDERBUFFER, renderId); - if (GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER) != GLES20.GL_FRAMEBUFFER_COMPLETE) { - Log.d(TAG, "Framebuffer error"); - } - - ByteBuffer rgbaBuf = ByteBuffer.allocateDirect(width * height * 4); - rgbaBuf.position(0); - GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, rgbaBuf); 
- - Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888); - bitmap.copyPixelsFromBuffer(rgbaBuf); - - GLES20.glDeleteRenderbuffers(1, IntBuffer.wrap(framebuffers)); - GLES20.glDeleteFramebuffers(1, IntBuffer.allocate(framebufferId)); - - GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, oldFboId[0]); - - return bitmap; - } catch (Exception e) { - Log.e(TAG, "", e); - } - return null; - } - - private static Bitmap nv21ToBitmap(byte[] nv21, int width, int height) { - Bitmap bitmap = null; - try { - YuvImage image = new YuvImage(nv21, ImageFormat.NV21, width, height, null); - ByteArrayOutputStream stream = new ByteArrayOutputStream(); - image.compressToJpeg(new Rect(0, 0, width, height), 80, stream); - bitmap = BitmapFactory.decodeByteArray(stream.toByteArray(), 0, stream.size()); - stream.close(); - } catch (IOException e) { - e.printStackTrace(); - } - return bitmap; - } - - private static Bitmap readBitmap(int width, int height){ - ByteBuffer rgbaBuf = ByteBuffer.allocateDirect(width * height * 4); - rgbaBuf.position(0); - GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, rgbaBuf); - - Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888); - bitmap.copyPixelsFromBuffer(rgbaBuf); - return bitmap; - } -} diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/GLThread.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/GLThread.java index daeb29a38..acf6d5dbc 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/GLThread.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/GLThread.java @@ -5,6 +5,9 @@ import java.lang.annotation.RetentionPolicy; import java.lang.annotation.Target; +/** + * The interface Gl thread. 
+ */ @Target(ElementType.METHOD) @Retention(RetentionPolicy.RUNTIME) public @interface GLThread { diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/ProgramLandmarks.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/ProgramLandmarks.java deleted file mode 100644 index 541d6e3c2..000000000 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/ProgramLandmarks.java +++ /dev/null @@ -1,143 +0,0 @@ -/* - * Copyright (C) 2011 The Android Open Source Project - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ -package io.agora.api.example.common.gles; - -import android.hardware.Camera; -import android.opengl.GLES20; -import android.opengl.Matrix; - -import java.util.Arrays; - -import io.agora.api.example.common.gles.core.Drawable2d; -import io.agora.api.example.common.gles.core.GlUtil; -import io.agora.api.example.common.gles.core.Program; - - -public class ProgramLandmarks extends Program { - - private static final String vertexShaderCode = - // This matrix member variable provides a hook to manipulate - // the coordinates of the objects that use this vertex shader - "uniform mat4 uMVPMatrix;" + - "attribute vec4 vPosition;" + - "uniform float uPointSize;" + - "void main() {" + - // the matrix must be included as a modifier of gl_Position - // Note that the uMVPMatrix factor *must be first* in order - // for the matrix multiplication product to be correct. 
- " gl_Position = uMVPMatrix * vPosition;" + - " gl_PointSize = uPointSize;" + - "}"; - - private static final String fragmentShaderCode = - "precision mediump float;" + - "uniform vec4 vColor;" + - "void main() {" + - " gl_FragColor = vColor;" + - "}"; - - private static final float color[] = {0.63671875f, 0.76953125f, 0.22265625f, 1.0f}; - - private int mPositionHandle; - private int mColorHandle; - private int mMVPMatrixHandle; - private int mPointSizeHandle; - - private float mPointSize = 6.0f; - - public ProgramLandmarks() { - super(vertexShaderCode, fragmentShaderCode); - } - - @Override - protected Drawable2d getDrawable2d() { - return new Drawable2dLandmarks(); - } - - @Override - protected void getLocations() { - // get handle to vertex shader's vPosition member - mPositionHandle = GLES20.glGetAttribLocation(mProgramHandle, "vPosition"); - GlUtil.checkGlError("vPosition"); - // get handle to fragment shader's vColor member - mColorHandle = GLES20.glGetUniformLocation(mProgramHandle, "vColor"); - GlUtil.checkGlError("vColor"); - // get handle to shape's transformation matrix - mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgramHandle, "uMVPMatrix"); - GlUtil.checkGlError("glGetUniformLocation"); - mPointSizeHandle = GLES20.glGetUniformLocation(mProgramHandle, "uPointSize"); - GlUtil.checkGlError("uPointSize"); - } - - @Override - public void drawFrame(int textureId, float[] texMatrix, float[] mvpMatrix) { - // Add program to OpenGL environment - GLES20.glUseProgram(mProgramHandle); - - // Enable a handle to the triangle vertices - GLES20.glEnableVertexAttribArray(mPositionHandle); - - // Prepare the triangle coordinate data - GLES20.glVertexAttribPointer( - mPositionHandle, Drawable2d.COORDS_PER_VERTEX, - GLES20.GL_FLOAT, false, - Drawable2d.VERTEXTURE_STRIDE, mDrawable2d.vertexArray()); - - // Set color for drawing the triangle - GLES20.glUniform4fv(mColorHandle, 1, color, 0); - - // Apply the projection and view transformation - 
GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMtx, 0); - - GLES20.glUniform1f(mPointSizeHandle, mPointSize); - - // Draw the triangle - GLES20.glDrawArrays(GLES20.GL_POINTS, 0, mDrawable2d.vertexCount()); - - // Disable vertex array - GLES20.glDisableVertexAttribArray(mPositionHandle); - } - - public void drawFrame(int x, int y, int width, int height) { - drawFrame(0, null, null, x, y, width, height); - } - - private final float[] mvpMtx = new float[16]; - private int mCameraType; - private int mOrientation; - private int mWidth; - private int mHeight; - - public void refresh(float[] landmarksData, int width, int height, int orientation, int cameraType) { - if (mWidth != width || mHeight != height || mOrientation != orientation || mCameraType != cameraType) { - float[] orthoMtx = new float[16]; - float[] rotateMtx = new float[16]; - Matrix.orthoM(orthoMtx, 0, 0, width, 0, height, -1, 1); - Matrix.setRotateM(rotateMtx, 0, 360 - orientation, 0.0f, 0.0f, 1.0f); - if (cameraType == Camera.CameraInfo.CAMERA_FACING_BACK) { - Matrix.rotateM(rotateMtx, 0, 180, 1.0f, 0.0f, 0.0f); - } - Matrix.multiplyMM(mvpMtx, 0, rotateMtx, 0, orthoMtx, 0); - - mWidth = width; - mHeight = height; - mOrientation = orientation; - mCameraType = cameraType; - } - - updateVertexArray(Arrays.copyOf(landmarksData, landmarksData.length)); - } -} diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/ProgramTexture2d.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/ProgramTexture2d.java deleted file mode 100644 index 73221726b..000000000 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/ProgramTexture2d.java +++ /dev/null @@ -1,124 +0,0 @@ -/* - * Copyright 2014 Google Inc. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package io.agora.api.example.common.gles; - -import android.opengl.GLES20; - -import io.agora.api.example.common.gles.core.Drawable2d; -import io.agora.api.example.common.gles.core.GlUtil; -import io.agora.api.example.common.gles.core.Program; - - -public class ProgramTexture2d extends Program { - - // Simple vertex shader, used for all programs. - private static final String VERTEX_SHADER = - "uniform mat4 uMVPMatrix;\n" + - "uniform mat4 uTexMatrix;\n" + - "attribute vec4 aPosition;\n" + - "attribute vec4 aTextureCoord;\n" + - "varying vec2 vTextureCoord;\n" + - "void main() {\n" + - " gl_Position = uMVPMatrix * aPosition;\n" + - " vTextureCoord = (uTexMatrix * aTextureCoord).xy;\n" + - "}\n"; - - // Simple fragment shader for use with "normal" 2D textures. 
- private static final String FRAGMENT_SHADER_2D = - "precision mediump float;\n" + - "varying vec2 vTextureCoord;\n" + - "uniform sampler2D sTexture;\n" + - "void main() {\n" + - " gl_FragColor = vec4(texture2D(sTexture, vTextureCoord).rgb, 1.0);\n" + - "}\n"; - - private int muMVPMatrixLoc; - private int muTexMatrixLoc; - private int maPositionLoc; - private int maTextureCoordLoc; - - public ProgramTexture2d() { - super(VERTEX_SHADER, FRAGMENT_SHADER_2D); - } - - @Override - protected Drawable2d getDrawable2d() { - return new Drawable2dFull(); - } - - @Override - protected void getLocations() { - maPositionLoc = GLES20.glGetAttribLocation(mProgramHandle, "aPosition"); - GlUtil.checkLocation(maPositionLoc, "aPosition"); - maTextureCoordLoc = GLES20.glGetAttribLocation(mProgramHandle, "aTextureCoord"); - GlUtil.checkLocation(maTextureCoordLoc, "aTextureCoord"); - muMVPMatrixLoc = GLES20.glGetUniformLocation(mProgramHandle, "uMVPMatrix"); - GlUtil.checkLocation(muMVPMatrixLoc, "uMVPMatrix"); - muTexMatrixLoc = GLES20.glGetUniformLocation(mProgramHandle, "uTexMatrix"); - GlUtil.checkLocation(muTexMatrixLoc, "uTexMatrix"); - } - - @Override - public void drawFrame(int textureId, float[] texMatrix, float[] mvpMatrix) { - GlUtil.checkGlError("draw start"); - - // Select the program. - GLES20.glUseProgram(mProgramHandle); - GlUtil.checkGlError("glUseProgram"); - - // Set the texture. - GLES20.glActiveTexture(GLES20.GL_TEXTURE0); - GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId); - - // Copy the model / view / projection matrix over. - GLES20.glUniformMatrix4fv(muMVPMatrixLoc, 1, false, mvpMatrix, 0); - GlUtil.checkGlError("glUniformMatrix4fv"); - - // Copy the texture transformation matrix over. - GLES20.glUniformMatrix4fv(muTexMatrixLoc, 1, false, texMatrix, 0); - GlUtil.checkGlError("glUniformMatrix4fv"); - - // Enable the "aPosition" vertex attribute. 
- GLES20.glEnableVertexAttribArray(maPositionLoc); - GlUtil.checkGlError("glEnableVertexAttribArray"); - - // Connect vertexBuffer to "aPosition". - GLES20.glVertexAttribPointer(maPositionLoc, Drawable2d.COORDS_PER_VERTEX, - GLES20.GL_FLOAT, false, Drawable2d.VERTEXTURE_STRIDE, mDrawable2d.vertexArray()); - GlUtil.checkGlError("glVertexAttribPointer"); - - // Enable the "aTextureCoord" vertex attribute. - GLES20.glEnableVertexAttribArray(maTextureCoordLoc); - GlUtil.checkGlError("glEnableVertexAttribArray"); - - // Connect texBuffer to "aTextureCoord". - GLES20.glVertexAttribPointer(maTextureCoordLoc, 2, - GLES20.GL_FLOAT, false, Drawable2d.TEXTURE_COORD_STRIDE, mDrawable2d.texCoordArray()); - GlUtil.checkGlError("glVertexAttribPointer"); - - // Draw the rect. - GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, mDrawable2d.vertexCount()); - GlUtil.checkGlError("glDrawArrays"); - - // Done -- disable vertex array, texture, and program. - GLES20.glDisableVertexAttribArray(maPositionLoc); - GLES20.glDisableVertexAttribArray(maTextureCoordLoc); - GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0); - GLES20.glUseProgram(0); - } - -} diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/ProgramTextureOES.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/ProgramTextureOES.java index 9eaff0787..81f2179b3 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/ProgramTextureOES.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/ProgramTextureOES.java @@ -24,30 +24,33 @@ import io.agora.api.example.common.gles.core.Program; +/** + * The type Program texture oes. + */ public class ProgramTextureOES extends Program { // Simple vertex shader, used for all programs. 
private static final String VERTEX_SHADER = - "uniform mat4 uMVPMatrix;\n" + - "uniform mat4 uTexMatrix;\n" + - "attribute vec4 aPosition;\n" + - "attribute vec4 aTextureCoord;\n" + - "varying vec2 vTextureCoord;\n" + - "void main() {\n" + - " gl_Position = uMVPMatrix * aPosition;\n" + - " vTextureCoord = (uTexMatrix * aTextureCoord).xy;\n" + - "}\n"; + "uniform mat4 uMVPMatrix;\n" + + "uniform mat4 uTexMatrix;\n" + + "attribute vec4 aPosition;\n" + + "attribute vec4 aTextureCoord;\n" + + "varying vec2 vTextureCoord;\n" + + "void main() {\n" + + " gl_Position = uMVPMatrix * aPosition;\n" + + " vTextureCoord = (uTexMatrix * aTextureCoord).xy;\n" + + "}\n"; // Simple fragment shader for use with external 2D textures (e.g. what we get from // SurfaceTexture). private static final String FRAGMENT_SHADER_EXT = - "#extension GL_OES_EGL_image_external : require\n" + - "precision mediump float;\n" + - "varying vec2 vTextureCoord;\n" + - "uniform samplerExternalOES sTexture;\n" + - "void main() {\n" + - " gl_FragColor = texture2D(sTexture, vTextureCoord);\n" + - "}\n"; + "#extension GL_OES_EGL_image_external : require\n" + + "precision mediump float;\n" + + "varying vec2 vTextureCoord;\n" + + "uniform samplerExternalOES sTexture;\n" + + "void main() {\n" + + " gl_FragColor = texture2D(sTexture, vTextureCoord);\n" + + "}\n"; private int muMVPMatrixLoc; private int muTexMatrixLoc; diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/Drawable2d.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/Drawable2d.java index c3d2abe68..6179d6f73 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/Drawable2d.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/Drawable2d.java @@ -23,15 +23,30 @@ */ public class Drawable2d { + /** + * The constant SIZEOF_FLOAT. + */ public static final int SIZEOF_FLOAT = 4; + /** + * The constant COORDS_PER_VERTEX. 
+ */ public static final int COORDS_PER_VERTEX = 2; + /** + * The constant TEXTURE_COORD_STRIDE. + */ public static final int TEXTURE_COORD_STRIDE = 2 * SIZEOF_FLOAT; + /** + * The constant VERTEXTURE_STRIDE. + */ public static final int VERTEXTURE_STRIDE = COORDS_PER_VERTEX * SIZEOF_FLOAT; private FloatBuffer mTexCoordArray; private FloatBuffer mVertexArray; private int mVertexCount; + /** + * Instantiates a new Drawable 2 d. + */ public Drawable2d() { } @@ -39,25 +54,40 @@ public Drawable2d() { * Prepares a drawable from a "pre-fabricated" shape definition. *

    * Does no EGL/GL operations, so this can be done at any time. + * + * @param fullRectangleCoords the full rectangle coords + * @param fullRectangleTexCoords the full rectangle tex coords */ - public Drawable2d(float[] FULL_RECTANGLE_COORDS, float[] FULL_RECTANGLE_TEX_COORDS) { - updateVertexArray(FULL_RECTANGLE_COORDS); - updateTexCoordArray(FULL_RECTANGLE_TEX_COORDS); + public Drawable2d(float[] fullRectangleCoords, float[] fullRectangleTexCoords) { + updateVertexArray(fullRectangleCoords); + updateTexCoordArray(fullRectangleTexCoords); } - public void updateVertexArray(float[] FULL_RECTANGLE_COORDS) { - mVertexArray = GlUtil.createFloatBuffer(FULL_RECTANGLE_COORDS); - mVertexCount = FULL_RECTANGLE_COORDS.length / COORDS_PER_VERTEX; + /** + * Update vertex array. + * + * @param fullRectangleCoords the full rectangle coords + */ + public void updateVertexArray(float[] fullRectangleCoords) { + mVertexArray = GlUtil.createFloatBuffer(fullRectangleCoords); + mVertexCount = fullRectangleCoords.length / COORDS_PER_VERTEX; } - public void updateTexCoordArray(float[] FULL_RECTANGLE_TEX_COORDS) { - mTexCoordArray = GlUtil.createFloatBuffer(FULL_RECTANGLE_TEX_COORDS); + /** + * Update tex coord array. + * + * @param fullRectangleTexCoords the full rectangle tex coords + */ + public void updateTexCoordArray(float[] fullRectangleTexCoords) { + mTexCoordArray = GlUtil.createFloatBuffer(fullRectangleTexCoords); } /** * Returns the array of vertices. *

    * To avoid allocations, this returns internal state. The caller must not modify it. + * + * @return the float buffer */ public FloatBuffer vertexArray() { return mVertexArray; @@ -67,6 +97,8 @@ public FloatBuffer vertexArray() { * Returns the array of texture coordinates. *

    * To avoid allocations, this returns internal state. The caller must not modify it. + * + * @return the float buffer */ public FloatBuffer texCoordArray() { return mTexCoordArray; @@ -74,6 +106,8 @@ public FloatBuffer texCoordArray() { /** * Returns the number of vertices stored in the vertex array. + * + * @return the int */ public int vertexCount() { return mVertexCount; diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/EglCore.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/EglCore.java index 59ce2f986..7adb5b5b0 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/EglCore.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/EglCore.java @@ -141,6 +141,7 @@ public EglCore(EGLContext sharedContext, int flags) { * * @param flags Bit flags from constructor. * @param version Must be 2 or 3. + * @return EGLConfig */ private EGLConfig getConfig(int flags, int version) { int renderableType = EGL14.EGL_OPENGL_ES2_BIT; @@ -214,6 +215,11 @@ protected void finalize() throws Throwable { } } + /** + * Gets egl context. + * + * @return the egl context + */ public EGLContext getEGLContext() { return mEGLContext; } @@ -221,6 +227,8 @@ public EGLContext getEGLContext() { /** * Destroys the specified surface. Note the EGLSurface won't actually be destroyed if it's * still current in a context. + * + * @param eglSurface the egl surface */ public void releaseSurface(EGLSurface eglSurface) { EGL14.eglDestroySurface(mEGLDisplay, eglSurface); @@ -230,6 +238,9 @@ public void releaseSurface(EGLSurface eglSurface) { * Creates an EGL surface associated with a Surface. *

    * If this is destined for MediaCodec, the EGLConfig should have the "recordable" attribute. + * + * @param surface the surface + * @return the egl surface */ public EGLSurface createWindowSurface(Object surface) { if (!(surface instanceof Surface) && !(surface instanceof SurfaceTexture)) { @@ -251,6 +262,10 @@ public EGLSurface createWindowSurface(Object surface) { /** * Creates an EGL surface associated with an offscreen buffer. + * + * @param width the width + * @param height the height + * @return the egl surface */ public EGLSurface createOffscreenSurface(int width, int height) { int[] surfaceAttribs = { @@ -269,6 +284,8 @@ public EGLSurface createOffscreenSurface(int width, int height) { /** * Makes our EGL context current, using the supplied surface for both "draw" and "read". + * + * @param eglSurface the egl surface */ public void makeCurrent(EGLSurface eglSurface) { if (mEGLDisplay == EGL14.EGL_NO_DISPLAY) { @@ -282,6 +299,9 @@ public void makeCurrent(EGLSurface eglSurface) { /** * Makes our EGL context current, using the supplied "draw" and "read" surfaces. + * + * @param drawSurface the draw surface + * @param readSurface the read surface */ public void makeCurrent(EGLSurface drawSurface, EGLSurface readSurface) { if (mEGLDisplay == EGL14.EGL_NO_DISPLAY) { @@ -306,6 +326,7 @@ public void makeNothingCurrent() { /** * Calls eglSwapBuffers. Use this to "publish" the current frame. * + * @param eglSurface the egl surface * @return false on failure */ public boolean swapBuffers(EGLSurface eglSurface) { @@ -314,6 +335,9 @@ public boolean swapBuffers(EGLSurface eglSurface) { /** * Sends the presentation time stamp to EGL. Time is expressed in nanoseconds. 
+ * + * @param eglSurface the egl surface + * @param nsecs the nsecs */ public void setPresentationTime(EGLSurface eglSurface, long nsecs) { EGLExt.eglPresentationTimeANDROID(mEGLDisplay, eglSurface, nsecs); @@ -321,12 +345,20 @@ public void setPresentationTime(EGLSurface eglSurface, long nsecs) { /** * Returns true if our context and the specified surface are current. + * + * @param eglSurface the egl surface + * @return the boolean */ public boolean isCurrent(EGLSurface eglSurface) { - return mEGLContext.equals(EGL14.eglGetCurrentContext()) && - eglSurface.equals(EGL14.eglGetCurrentSurface(EGL14.EGL_DRAW)); + return mEGLContext.equals(EGL14.eglGetCurrentContext()) + && eglSurface.equals(EGL14.eglGetCurrentSurface(EGL14.EGL_DRAW)); } + /** + * Gets current drawing surface. + * + * @return the current drawing surface + */ public EGLSurface getCurrentDrawingSurface() { EGLSurface surface = null; if (mEGLContext.equals(EGL14.eglGetCurrentContext())) { @@ -337,6 +369,10 @@ public EGLSurface getCurrentDrawingSurface() { /** * Performs a simple surface query. + * + * @param eglSurface the egl surface + * @param what the what + * @return the int */ public int querySurface(EGLSurface eglSurface, int what) { int[] value = new int[1]; @@ -346,6 +382,9 @@ public int querySurface(EGLSurface eglSurface, int what) { /** * Queries a string value. + * + * @param what the what + * @return the string */ public String queryString(int what) { return EGL14.eglQueryString(mEGLDisplay, what); @@ -353,6 +392,8 @@ public String queryString(int what) { /** * Returns the GLES version this context is configured for (currently 2 or 3). + * + * @return the gl version */ public int getGlVersion() { return mGlVersion; @@ -360,6 +401,8 @@ public int getGlVersion() { /** * Writes the current display, context, and surface to the log. 
+ * + * @param msg the msg */ public static void logCurrent(String msg) { EGLDisplay display; @@ -369,16 +412,24 @@ public static void logCurrent(String msg) { display = EGL14.eglGetCurrentDisplay(); context = EGL14.eglGetCurrentContext(); surface = EGL14.eglGetCurrentSurface(EGL14.EGL_DRAW); - Log.i(TAG, "Current EGL (" + msg + "): display=" + display + ", context=" + context + - ", surface=" + surface); + Log.i(TAG, "Current EGL (" + + msg + + "): display=" + + display + + ", context=" + + context + + ", surface=" + + surface); } /** * Checks for EGL errors. Throws an exception if an error has been raised. + * + * @param msg error message. */ private void checkEglError(String msg) { - int error; - if ((error = EGL14.eglGetError()) != EGL14.EGL_SUCCESS) { + int error = EGL14.eglGetError(); + if (error != EGL14.EGL_SUCCESS) { throw new RuntimeException(msg + ": EGL error: 0x" + Integer.toHexString(error)); } } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/EglSurfaceBase.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/EglSurfaceBase.java index 98f776ecf..467a227f2 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/EglSurfaceBase.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/EglSurfaceBase.java @@ -35,15 +35,26 @@ * There can be multiple surfaces associated with a single context. */ public class EglSurfaceBase { + /** + * The constant TAG. + */ protected static final String TAG = GlUtil.TAG; - // EglCore object we're associated with. It may be associated with multiple surfaces. + /** + * The M egl core. + */ +// EglCore object we're associated with. It may be associated with multiple surfaces. protected EglCore mEglCore; private EGLSurface mEGLSurface = EGL14.EGL_NO_SURFACE; private int mWidth = -1; private int mHeight = -1; + /** + * Instantiates a new Egl surface base. 
+ * + * @param eglCore the egl core + */ protected EglSurfaceBase(EglCore eglCore) { mEglCore = eglCore; } @@ -68,6 +79,9 @@ public void createWindowSurface(Object surface) { /** * Creates an off-screen surface. + * + * @param width the width + * @param height the height */ public void createOffscreenSurface(int width, int height) { if (mEGLSurface != EGL14.EGL_NO_SURFACE) { @@ -84,6 +98,8 @@ public void createOffscreenSurface(int width, int height) { * If this is called on a window surface, and the underlying surface is in the process * of changing size, we may not see the new size right away (e.g. in the "surfaceChanged" * callback). The size should match after the next buffer swap. + * + * @return the width */ public int getWidth() { if (mWidth < 0) { @@ -95,6 +111,8 @@ public int getWidth() { /** * Returns the surface's height, in pixels. + * + * @return the height */ public int getHeight() { if (mHeight < 0) { @@ -110,7 +128,8 @@ public int getHeight() { public void releaseEglSurface() { mEglCore.releaseSurface(mEGLSurface); mEGLSurface = EGL14.EGL_NO_SURFACE; - mWidth = mHeight = -1; + mWidth = -1; + mHeight = -1; } /** @@ -123,6 +142,8 @@ public void makeCurrent() { /** * Makes our EGL context and surface current for drawing, using the supplied surface * for reading. + * + * @param readSurface the read surface */ public void makeCurrentReadFrom(EglSurfaceBase readSurface) { mEglCore.makeCurrent(mEGLSurface, readSurface.mEGLSurface); @@ -154,6 +175,9 @@ public void setPresentationTime(long nsecs) { * Saves the EGL surface to a file. *

    * Expects that this object's EGL surface is current. + * + * @param file the file + * @throws IOException the io exception */ public void saveFrame(File file) throws IOException { if (!mEglCore.isCurrent(mEGLSurface)) { @@ -191,7 +215,9 @@ public void saveFrame(File file) throws IOException { bmp.compress(Bitmap.CompressFormat.PNG, 90, bos); bmp.recycle(); } finally { - if (bos != null) bos.close(); + if (bos != null) { + bos.close(); + } } Log.d(TAG, "Saved " + width + "x" + height + " frame as '" + filename + "'"); } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/Extensions.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/Extensions.java index 06f33b1cd..40b353239 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/Extensions.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/Extensions.java @@ -6,8 +6,17 @@ import java.io.IOException; import java.io.InputStream; +/** + * The type Extensions. + */ public abstract class Extensions { + /** + * Get bytes byte [ ]. + * + * @param inputStream the input stream + * @return the byte [ ] + */ public static byte[] getBytes(InputStream inputStream) { try { byte[] bytes = new byte[inputStream.available()]; @@ -21,6 +30,13 @@ public static byte[] getBytes(InputStream inputStream) { return new byte[0]; } + /** + * Get bytes byte [ ]. + * + * @param assetManager the asset manager + * @param fileName the file name + * @return the byte [ ] + */ public static byte[] getBytes(AssetManager assetManager, String fileName) { try { return getBytes(assetManager.open(fileName)); @@ -31,6 +47,13 @@ public static byte[] getBytes(AssetManager assetManager, String fileName) { return new byte[0]; } + /** + * Read text file from resource string. 
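As an aside on the `Extensions.getBytes` hunk above: it sizes its buffer with `InputStream.available()`, which is defined as an estimate of how many bytes can be read without blocking, not the stream's total length, so chunked streams can be silently truncated. A minimal plain-Java sketch of a loop-until-EOF alternative (the `StreamBytes`/`readAll` names are illustrative, not part of this patch):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical safer variant of Extensions.getBytes: instead of trusting
// InputStream.available() to size the buffer, read in a loop until EOF.
public final class StreamBytes {
    private StreamBytes() {
    }

    public static byte[] readAll(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] chunk = new byte[4096];
        int n;
        while ((n = in.read(chunk)) != -1) { // read until end of stream
            out.write(chunk, 0, n);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "hello".getBytes("UTF-8");
        byte[] copy = readAll(new ByteArrayInputStream(data));
        System.out.println(copy.length); // 5
    }
}
```

Where Java 9+ library APIs are available, `InputStream.readAllBytes()` is the equivalent one-liner; the loop works everywhere.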
+ * + * @param context the context + * @param resourceId the resource id + * @return the string + */ public static String readTextFileFromResource(Context context, int resourceId) { return new String(Extensions.getBytes(context.getResources().openRawResource(resourceId))); } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/GlUtil.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/GlUtil.java index 13cb63889..d45d903d4 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/GlUtil.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/GlUtil.java @@ -32,7 +32,10 @@ * Some OpenGL utility functions. */ public abstract class GlUtil { - //public static final String TAG = "Grafika"; + /** + * The constant TAG. + */ +//public static final String TAG = "Grafika"; public static final String TAG = "mqi"; /** * Identity matrix for general use. Don't modify or life will get weird. @@ -53,6 +56,8 @@ private GlUtil() { /** * Creates a new program from the supplied vertex and fragment shaders. * + * @param vertexSource the vertex source + * @param fragmentSource the fragment source * @return A handle to the program, or 0 on failure. */ public static int createProgram(String vertexSource, String fragmentSource) { @@ -89,6 +94,8 @@ public static int createProgram(String vertexSource, String fragmentSource) { /** * Compiles the provided shader source. * + * @param shaderType the shader type + * @param source the source * @return A handle to the shader, or 0 on failure. */ public static int loadShader(int shaderType, String source) { @@ -109,6 +116,8 @@ public static int loadShader(int shaderType, String source) { /** * Checks to see if a GLES error has been raised. 
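The `checkGlError` helper documented above, like the `checkEglError` change earlier in this patch, follows the checkstyle-friendly shape of fetching the error code into a local variable first rather than assigning inside the `if` condition. A self-contained plain-Java analogue (no GL context required; `getError`/`ERROR_NONE` and the error-latch behaviour are stand-ins invented for this demo, not real GL calls):

```java
// Sketch of the "fetch, then test" error-check pattern used by checkGlError /
// checkEglError in this patch. getError() models eglGetError(): reading the
// latched error code clears it.
public final class ErrorCheck {
    static final int ERROR_NONE = 0;
    private static int lastError = ERROR_NONE;

    private ErrorCheck() {
    }

    static void setError(int e) {
        lastError = e;
    }

    static int getError() {
        int e = lastError;
        lastError = ERROR_NONE; // reading clears the flag, like eglGetError()
        return e;
    }

    static void check(String msg) {
        int error = getError();          // fetched once, outside the condition
        if (error != ERROR_NONE) {
            throw new RuntimeException(msg + ": error: 0x" + Integer.toHexString(error));
        }
    }

    public static void main(String[] args) {
        check("ok");                     // no error pending, passes silently
        setError(0x3001);
        try {
            check("bad");
        } catch (RuntimeException e) {
            System.out.println(e.getMessage()); // bad: error: 0x3001
        }
    }
}
```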
+ * + * @param op the op */ public static void checkGlError(String op) { int error = GLES20.glGetError(); @@ -123,6 +132,9 @@ public static void checkGlError(String op) { * could not be found, but does not set the GL error. *

    * Throws a RuntimeException if the location is invalid. + * + * @param location the location + * @param label the label */ public static void checkLocation(int location, String label) { if (location < 0) { @@ -200,6 +212,9 @@ public static int createImageTexture(Bitmap bmp) { /** * Allocates a direct float buffer, and populates it with the float array data. + * + * @param coords the coords + * @return the float buffer */ public static FloatBuffer createFloatBuffer(float[] coords) { // Allocate a direct ByteBuffer, using 4 bytes per float, and copy coords into it. @@ -236,6 +251,9 @@ public static void logVersionInfo() { * Creates a texture object suitable for use with this program. *

    * On exit, the texture will be bound. + * + * @param textureTarget the texture target + * @return the int */ public static int createTextureObject(int textureTarget) { int[] textures = new int[1]; @@ -259,6 +277,11 @@ public static int createTextureObject(int textureTarget) { return texId; } + /** + * Delete texture object. + * + * @param textureId the texture id + */ public static void deleteTextureObject(int textureId) { int[] textures = new int[1]; textures[0] = textureId; @@ -266,6 +289,16 @@ public static void deleteTextureObject(int textureId) { GlUtil.checkGlError("glDeleteTextures"); } + /** + * Change mvp matrix float [ ]. + * + * @param mvpMatrix the mvp matrix + * @param viewWidth the view width + * @param viewHeight the view height + * @param textureWidth the texture width + * @param textureHeight the texture height + * @return the float [ ] + */ public static float[] changeMVPMatrix(float[] mvpMatrix, float viewWidth, float viewHeight, float textureWidth, float textureHeight) { float scale = viewWidth * textureHeight / viewHeight / textureWidth; if (scale == 1) { diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/OffscreenSurface.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/OffscreenSurface.java deleted file mode 100644 index 447a1416c..000000000 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/OffscreenSurface.java +++ /dev/null @@ -1,39 +0,0 @@ -/* - * Copyright 2013 Google Inc. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. 
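For reference, the `changeMVPMatrix` hunk above starts from the ratio `viewWidth * textureHeight / viewHeight / textureWidth`; when it equals 1, the view and the texture already share an aspect ratio and the MVP matrix needs no corrective scaling. A standalone sketch of just that computation (`AspectScale` is an illustrative name, not part of the patch):

```java
// The aspect-ratio test behind GlUtil.changeMVPMatrix: the scale factor is
// the view's aspect ratio divided by the texture's. 1.0 means they match.
public final class AspectScale {
    private AspectScale() {
    }

    public static float scale(float viewWidth, float viewHeight,
                              float textureWidth, float textureHeight) {
        // == (viewWidth / viewHeight) / (textureWidth / textureHeight)
        return viewWidth * textureHeight / viewHeight / textureWidth;
    }

    public static void main(String[] args) {
        // 16:9 view showing a 16:9 texture -> ratios match, no scaling needed
        System.out.println(scale(1920f, 1080f, 1280f, 720f)); // 1.0
    }
}
```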
- * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package io.agora.api.example.common.gles.core; - -/** - * Off-screen EGL surface (pbuffer). - *

    - * It's good practice to explicitly release() the surface, preferably from a "finally" block. - */ -public class OffscreenSurface extends EglSurfaceBase { - /** - * Creates an off-screen surface with the specified width and height. - */ - public OffscreenSurface(EglCore eglCore, int width, int height) { - super(eglCore); - createOffscreenSurface(width, height); - } - - /** - * Releases any resources associated with the surface. - */ - public void release() { - releaseEglSurface(); - } -} diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/Program.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/Program.java index 537e5d18d..36ea0b11d 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/Program.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/Program.java @@ -6,37 +6,67 @@ /** * Created by tujh on 2018/1/24. */ - public abstract class Program { private static final String TAG = GlUtil.TAG; - // Handles to the GL program and various components of it. + /** + * The M program handle. + */ +// Handles to the GL program and various components of it. protected int mProgramHandle; + /** + * The M drawable 2 d. + */ protected Drawable2d mDrawable2d; /** * Prepares the program in the current EGL context. + * + * @param vertexShader the vertex shader + * @param fragmentShader2D the fragment shader 2 d */ - public Program(String VERTEX_SHADER, String FRAGMENT_SHADER_2D) { - mProgramHandle = GlUtil.createProgram(VERTEX_SHADER, FRAGMENT_SHADER_2D); + public Program(String vertexShader, String fragmentShader2D) { + mProgramHandle = GlUtil.createProgram(vertexShader, fragmentShader2D); mDrawable2d = getDrawable2d(); getLocations(); } + /** + * Instantiates a new Program. 
+ * + * @param context the context + * @param vertexShaderResourceId the vertex shader resource id + * @param fragmentShaderResourceId the fragment shader resource id + */ public Program(Context context, int vertexShaderResourceId, int fragmentShaderResourceId) { this(Extensions.readTextFileFromResource(context, vertexShaderResourceId), Extensions.readTextFileFromResource(context, fragmentShaderResourceId)); } - public void updateVertexArray(float[] FULL_RECTANGLE_COORDS) { - mDrawable2d.updateVertexArray(FULL_RECTANGLE_COORDS); + /** + * Update vertex array. + * + * @param fullRectangleCoords the full rectangle coords + */ + public void updateVertexArray(float[] fullRectangleCoords) { + mDrawable2d.updateVertexArray(fullRectangleCoords); } - public void updateTexCoordArray(float[] FULL_RECTANGLE_TEX_COORDS) { - mDrawable2d.updateTexCoordArray(FULL_RECTANGLE_TEX_COORDS); + /** + * Update tex coord array. + * + * @param fullRectangleTexCoords the full rectangle tex coords + */ + public void updateTexCoordArray(float[] fullRectangleTexCoords) { + mDrawable2d.updateTexCoordArray(fullRectangleTexCoords); } + /** + * Gets drawable 2 d. + * + * @return the drawable 2 d + */ protected abstract Drawable2d getDrawable2d(); /** @@ -46,13 +76,34 @@ public void updateTexCoordArray(float[] FULL_RECTANGLE_TEX_COORDS) { /** * Issues the draw call. Does the full setup on every call. + * + * @param textureId the texture id + * @param texMatrix the tex matrix + * @param mvpMatrix the mvp matrix */ public abstract void drawFrame(int textureId, float[] texMatrix, float[] mvpMatrix); + /** + * Draw frame. + * + * @param textureId the texture id + * @param texMatrix the tex matrix + */ public void drawFrame(int textureId, float[] texMatrix) { drawFrame(textureId, texMatrix, GlUtil.IDENTITY_MATRIX); } + /** + * Draw frame. 
+ * + * @param textureId the texture id + * @param texMatrix the tex matrix + * @param mvpMatrix the mvp matrix + * @param x the x + * @param y the y + * @param width the width + * @param height the height + */ public void drawFrame(int textureId, float[] texMatrix, float[] mvpMatrix, int x, int y, int width, int height) { int[] originalViewport = new int[4]; GLES20.glGetIntegerv(GLES20.GL_VIEWPORT, originalViewport, 0); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/WindowSurface.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/WindowSurface.java deleted file mode 100644 index 2c784f6ab..000000000 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/gles/core/WindowSurface.java +++ /dev/null @@ -1,95 +0,0 @@ -/* - * Copyright 2013 Google Inc. All rights reserved. - * - * Licensed under the Apache License, Version 2.0 (the "License"); - * you may not use this file except in compliance with the License. - * You may obtain a copy of the License at - * - * http://www.apache.org/licenses/LICENSE-2.0 - * - * Unless required by applicable law or agreed to in writing, software - * distributed under the License is distributed on an "AS IS" BASIS, - * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. - * See the License for the specific language governing permissions and - * limitations under the License. - */ - -package io.agora.api.example.common.gles.core; - -import android.graphics.SurfaceTexture; -import android.view.Surface; - -/** - * Recordable EGL window surface. - *

    - * It's good practice to explicitly release() the surface, preferably from a "finally" block. - */ -public class WindowSurface extends EglSurfaceBase { - private Surface mSurface; - private boolean mReleaseSurface; - - /** - * Associates an EGL surface with the native window surface. - *

    - * Set releaseSurface to true if you want the Surface to be released when release() is - * called. This is convenient, but can interfere with framework classes that expect to - * manage the Surface themselves (e.g. if you release a SurfaceView's Surface, the - * surfaceDestroyed() callback won't fire). - */ - public WindowSurface(EglCore eglCore, Surface surface, boolean releaseSurface) { - super(eglCore); - createWindowSurface(surface); - mSurface = surface; - mReleaseSurface = releaseSurface; - } - - /** - * Associates an EGL surface with the SurfaceTexture. - */ - public WindowSurface(EglCore eglCore, SurfaceTexture surfaceTexture) { - super(eglCore); - createWindowSurface(surfaceTexture); - } - - public WindowSurface(EglCore eglCore, int width, int height) { - super(eglCore); - createOffscreenSurface(width, height); - } - - /** - * Releases any resources associated with the EGL surface (and, if configured to do so, - * with the Surface as well). - *

    - * Does not require that the surface's EGL context be current. - */ - public void release() { - releaseEglSurface(); - if (mSurface != null) { - if (mReleaseSurface) { - mSurface.release(); - } - mSurface = null; - } - } - - /** - * Recreate the EGLSurface, using the new EglBase. The caller should have already - * freed the old EGLSurface with releaseEglSurface(). - *

    - * This is useful when we want to update the EGLSurface associated with a Surface. - * For example, if we want to share with a different EGLContext, which can only - * be done by tearing down and recreating the context. (That's handled by the caller; - * this just creates a new EGLSurface for the Surface we were handed earlier.) - *

    - * If the previous EGLSurface isn't fully destroyed, e.g. it's still current on a - * context somewhere, the create call will fail with complaints from the Surface - * about already being connected. - */ - public void recreate(EglCore newEglCore) { - if (mSurface == null) { - throw new RuntimeException("not yet implemented for SurfaceTexture"); - } - mEglCore = newEglCore; // switch to new context - createWindowSurface(mSurface); // create new surface - } -} diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/ExampleBean.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/ExampleBean.java index 331ae96a7..2914f366d 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/ExampleBean.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/ExampleBean.java @@ -4,6 +4,8 @@ import android.os.Parcelable; /** + * The type Example bean. + * * @author cjw */ public class ExampleBean implements Parcelable { @@ -13,6 +15,15 @@ public class ExampleBean implements Parcelable { private int actionId; private int tipsId; + /** + * Instantiates a new Example bean. + * + * @param index the index + * @param group the group + * @param name the name + * @param actionId the action id + * @param tipsId the tips id + */ public ExampleBean(int index, String group, int name, int actionId, int tipsId) { this.index = index; this.group = group; @@ -21,42 +32,92 @@ public ExampleBean(int index, String group, int name, int actionId, int tipsId) this.tipsId = tipsId; } + /** + * Gets index. + * + * @return the index + */ public int getIndex() { return index; } + /** + * Sets index. + * + * @param index the index + */ public void setIndex(int index) { this.index = index; } + /** + * Gets group. + * + * @return the group + */ public String getGroup() { return group; } + /** + * Sets group. 
+ * + * @param group the group + */ public void setGroup(String group) { this.group = group; } + /** + * Gets name. + * + * @return the name + */ public int getName() { return name; } + /** + * Sets name. + * + * @param name the name + */ public void setName(int name) { this.name = name; } + /** + * Gets action id. + * + * @return the action id + */ public int getActionId() { return actionId; } + /** + * Sets action id. + * + * @param actionId the action id + */ public void setActionId(int actionId) { this.actionId = actionId; } + /** + * Gets tips id. + * + * @return the tips id + */ public int getTipsId() { return tipsId; } + /** + * Sets tips id. + * + * @param tipsId the tips id + */ public void setTipsId(int tipsId) { this.tipsId = tipsId; } @@ -75,9 +136,17 @@ public void writeToParcel(Parcel dest, int flags) { dest.writeInt(this.tipsId); } + /** + * Instantiates a new Example bean. + */ public ExampleBean() { } + /** + * Instantiates a new Example bean. + * + * @param in the in + */ protected ExampleBean(Parcel in) { this.group = in.readString(); this.name = in.readInt(); @@ -85,6 +154,9 @@ protected ExampleBean(Parcel in) { this.tipsId = in.readInt(); } + /** + * The constant CREATOR. + */ public static final Creator<ExampleBean> CREATOR = new Creator<ExampleBean>() { @Override public ExampleBean createFromParcel(Parcel source) { diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/Examples.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/Examples.java index 80ec5092b..a2d1332d7 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/Examples.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/Examples.java @@ -11,12 +11,33 @@ import io.agora.api.example.annotation.Example; -public class Examples { +/** + * The type Examples. + */ +public final class Examples { + /** + * The constant BASIC.
+ */ public static final String BASIC = "BASIC"; + /** + * The constant ADVANCED. + */ public static final String ADVANCED = "ADVANCED"; + /** + * The constant ITEM_MAP. + */ public static final Map<String, List<Example>> ITEM_MAP = new HashMap<>(); + private Examples() { + + } + + /** + * Add item. + * + * @param item the item + */ public static void addItem(@NonNull Example item) { String group = item.group(); List<Example> list = ITEM_MAP.get(group); @@ -27,6 +48,9 @@ public static void addItem(@NonNull Example item) { list.add(item); } + /** + * Sort item. + */ public static void sortItem() { for (Map.Entry<String, List<Example>> entry : ITEM_MAP.entrySet()) { List<Example> exampleList = ITEM_MAP.get(entry.getKey()); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/GlobalSettings.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/GlobalSettings.java index ca711d0eb..7fdca98fd 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/GlobalSettings.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/GlobalSettings.java @@ -15,18 +15,39 @@ import io.agora.rtc2.proxy.LocalAccessPointConfiguration; import io.agora.rtc2.video.VideoEncoderConfiguration; +/** + * The type Global settings. + */ public class GlobalSettings { private String videoEncodingDimension; private String videoEncodingFrameRate; private String videoEncodingOrientation; private String areaCodeStr = "GLOBAL"; - // private cloud config + /** + * The Private cloud ip. + */ +// private cloud config public String privateCloudIp = ""; + /** + * The Private cloud log report enable. + */ public boolean privateCloudLogReportEnable = false; + /** + * The Private cloud log server domain. + */ public String privateCloudLogServerDomain = ""; + /** + * The Private cloud log server port. + */ public int privateCloudLogServerPort = 80; + /** + * The Private cloud log server path.
+ */ public String privateCloudLogServerPath = ""; + /** + * The Private cloud use https. + */ public boolean privateCloudUseHttps = false; // public String privateCloudIp = "10.62.0.85"; // public boolean privateCloudLogReportEnable = true; @@ -35,16 +56,27 @@ public class GlobalSettings { // public String privateCloudLogServerPath = "/kafka/log/upload/v1"; // public boolean privateCloudUseHttps = true; + /** + * Gets video encoding dimension. + * + * @return the video encoding dimension + */ public String getVideoEncodingDimension() { - if (videoEncodingDimension == null) + if (videoEncodingDimension == null) { return "VD_960x540"; - else + } else { return videoEncodingDimension; + } } + /** + * Gets private cloud config. + * + * @return the private cloud config + */ public LocalAccessPointConfiguration getPrivateCloudConfig() { LocalAccessPointConfiguration config = new LocalAccessPointConfiguration(); - if(TextUtils.isEmpty(privateCloudIp)){ + if (TextUtils.isEmpty(privateCloudIp)) { return null; } config.ipList = new ArrayList<>(); @@ -65,6 +97,11 @@ public LocalAccessPointConfiguration getPrivateCloudConfig() { return config; } + /** + * Gets video encoding dimension object. + * + * @return the video encoding dimension object + */ public VideoEncoderConfiguration.VideoDimensions getVideoEncodingDimensionObject() { VideoEncoderConfiguration.VideoDimensions value = VD_960x540; try { @@ -79,40 +116,83 @@ public VideoEncoderConfiguration.VideoDimensions getVideoEncodingDimensionObject return value; } + /** + * Sets video encoding dimension. + * + * @param videoEncodingDimension the video encoding dimension + */ public void setVideoEncodingDimension(String videoEncodingDimension) { this.videoEncodingDimension = videoEncodingDimension; } + /** + * Gets video encoding frame rate. 
+ * + * @return the video encoding frame rate + */ public String getVideoEncodingFrameRate() { - if (videoEncodingFrameRate == null) + if (videoEncodingFrameRate == null) { return FRAME_RATE_FPS_15.name(); - else + } else { return videoEncodingFrameRate; + } } + /** + * Sets video encoding frame rate. + * + * @param videoEncodingFrameRate the video encoding frame rate + */ public void setVideoEncodingFrameRate(String videoEncodingFrameRate) { this.videoEncodingFrameRate = videoEncodingFrameRate; } + /** + * Gets video encoding orientation. + * + * @return the video encoding orientation + */ public String getVideoEncodingOrientation() { - if (videoEncodingOrientation == null) + if (videoEncodingOrientation == null) { return ORIENTATION_MODE_ADAPTIVE.name(); - else + } else { return videoEncodingOrientation; + } + } + /** + * Sets video encoding orientation. + * + * @param videoEncodingOrientation the video encoding orientation + */ public void setVideoEncodingOrientation(String videoEncodingOrientation) { this.videoEncodingOrientation = videoEncodingOrientation; } + /** + * Gets area code str. + * + * @return the area code str + */ public String getAreaCodeStr() { return areaCodeStr; } + /** + * Sets area code str. + * + * @param areaCodeStr the area code str + */ public void setAreaCodeStr(String areaCodeStr) { this.areaCodeStr = areaCodeStr; } + /** + * Gets area code. 
+ * + * @return the area code + */ public int getAreaCode() { if ("CN".equals(areaCodeStr)) { return RtcEngineConfig.AreaCode.AREA_CODE_CN; diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/Peer.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/Peer.java deleted file mode 100644 index 676f7c7ec..000000000 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/Peer.java +++ /dev/null @@ -1,16 +0,0 @@ -package io.agora.api.example.common.model; - -import java.nio.ByteBuffer; - -/** - * Created by wyylling@gmail.com on 03/01/2018. - */ - -public class Peer { - public int uid; - public ByteBuffer data; - public int width; - public int height; - public int rotation; - public long ts; -} diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/StatisticsInfo.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/StatisticsInfo.java index 333ceedcd..41498800a 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/StatisticsInfo.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/model/StatisticsInfo.java @@ -1,40 +1,74 @@ package io.agora.api.example.common.model; -import io.agora.rtc2.IRtcEngineEventHandler.*; +import io.agora.rtc2.IRtcEngineEventHandler; + +/** + * The type Statistics info. 
+ */ public class StatisticsInfo { - private LocalVideoStats localVideoStats = new LocalVideoStats(); - private LocalAudioStats localAudioStats = new LocalAudioStats(); - private RemoteVideoStats remoteVideoStats = new RemoteVideoStats(); - private RemoteAudioStats remoteAudioStats = new RemoteAudioStats(); - private RtcStats rtcStats = new RtcStats(); + private IRtcEngineEventHandler.LocalVideoStats localVideoStats = new IRtcEngineEventHandler.LocalVideoStats(); + private IRtcEngineEventHandler.LocalAudioStats localAudioStats = new IRtcEngineEventHandler.LocalAudioStats(); + private IRtcEngineEventHandler.RemoteVideoStats remoteVideoStats = new IRtcEngineEventHandler.RemoteVideoStats(); + private IRtcEngineEventHandler.RemoteAudioStats remoteAudioStats = new IRtcEngineEventHandler.RemoteAudioStats(); + private IRtcEngineEventHandler.RtcStats rtcStats = new IRtcEngineEventHandler.RtcStats(); private int quality; - private LastmileProbeResult lastMileProbeResult; + private IRtcEngineEventHandler.LastmileProbeResult lastMileProbeResult; - public void setLocalVideoStats(LocalVideoStats localVideoStats) { + /** + * Sets local video stats. + * + * @param localVideoStats the local video stats + */ + public void setLocalVideoStats(IRtcEngineEventHandler.LocalVideoStats localVideoStats) { this.localVideoStats = localVideoStats; } - public void setLocalAudioStats(LocalAudioStats localAudioStats) { + /** + * Sets local audio stats. + * + * @param localAudioStats the local audio stats + */ + public void setLocalAudioStats(IRtcEngineEventHandler.LocalAudioStats localAudioStats) { this.localAudioStats = localAudioStats; } - public void setRemoteVideoStats(RemoteVideoStats remoteVideoStats) { + /** + * Sets remote video stats. 
+ * + * @param remoteVideoStats the remote video stats + */ + public void setRemoteVideoStats(IRtcEngineEventHandler.RemoteVideoStats remoteVideoStats) { this.remoteVideoStats = remoteVideoStats; } - public void setRemoteAudioStats(RemoteAudioStats remoteAudioStats) { + /** + * Sets remote audio stats. + * + * @param remoteAudioStats the remote audio stats + */ + public void setRemoteAudioStats(IRtcEngineEventHandler.RemoteAudioStats remoteAudioStats) { this.remoteAudioStats = remoteAudioStats; } - public void setRtcStats(RtcStats rtcStats) { + /** + * Sets rtc stats. + * + * @param rtcStats the rtc stats + */ + public void setRtcStats(IRtcEngineEventHandler.RtcStats rtcStats) { this.rtcStats = rtcStats; } + /** + * Gets local video stats. + * + * @return the local video stats + */ public String getLocalVideoStats() { StringBuilder builder = new StringBuilder(); return builder - .append(""+localVideoStats.encodedFrameWidth) + .append("" + localVideoStats.encodedFrameWidth) .append("×") .append(localVideoStats.encodedFrameHeight) .append(",") @@ -65,6 +99,11 @@ public String getLocalVideoStats() { .toString(); } + /** + * Gets remote video stats. + * + * @return the remote video stats + */ public String getRemoteVideoStats() { StringBuilder builder = new StringBuilder(); return builder @@ -96,12 +135,22 @@ public String getRemoteVideoStats() { .toString(); } + /** + * Sets last mile quality. + * + * @param quality the quality + */ public void setLastMileQuality(int quality) { this.quality = quality; } - public String getLastMileQuality(){ - switch (quality){ + /** + * Get last mile quality string. + * + * @return the string + */ + public String getLastMileQuality() { + switch (quality) { case 1: return "EXCELLENT"; case 2: @@ -123,9 +172,15 @@ public String getLastMileQuality(){ } } + /** + * Gets last mile result.
+ * + * @return the last mile result + */ public String getLastMileResult() { - if(lastMileProbeResult == null) + if (lastMileProbeResult == null) { return null; + } StringBuilder stringBuilder = new StringBuilder(); stringBuilder.append("Rtt: ") .append(lastMileProbeResult.rtt) @@ -157,7 +212,12 @@ public String getLastMileResult() { return stringBuilder.toString(); } - public void setLastMileProbeResult(LastmileProbeResult lastmileProbeResult) { + /** + * Sets last mile probe result. + * + * @param lastmileProbeResult the lastmile probe result + */ + public void setLastMileProbeResult(IRtcEngineEventHandler.LastmileProbeResult lastmileProbeResult) { this.lastMileProbeResult = lastmileProbeResult; } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/AudioOnlyLayout.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/AudioOnlyLayout.java index 30c60870c..98129782b 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/AudioOnlyLayout.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/AudioOnlyLayout.java @@ -16,40 +16,72 @@ import io.agora.api.example.R; +/** + * The type Audio only layout. + */ public class AudioOnlyLayout extends FrameLayout { private TextView tvUserType, tvUserId; private TableLayout tlState; + /** + * Instantiates a new Audio only layout. + * + * @param context the context + */ public AudioOnlyLayout(@NonNull Context context) { this(context, null); } + /** + * Instantiates a new Audio only layout. + * + * @param context the context + * @param attrs the attrs + */ public AudioOnlyLayout(@NonNull Context context, @Nullable AttributeSet attrs) { this(context, attrs, 0); } + /** + * Instantiates a new Audio only layout. 
+ * + * @param context the context + * @param attrs the attrs + * @param defStyleAttr the def style attr + */ public AudioOnlyLayout(@NonNull Context context, @Nullable AttributeSet attrs, int defStyleAttr) { super(context, attrs, defStyleAttr); initView(); } - private void initView(){ + private void initView() { View rootView = View.inflate(getContext(), R.layout.widget_audio_only_layout, this); tvUserType = rootView.findViewById(R.id.tv_user_type); tvUserId = rootView.findViewById(R.id.tv_user_id); tlState = rootView.findViewById(R.id.table_layout_state); } - public void updateUserInfo(String uid, boolean isLocal){ + /** + * Update user info. + * + * @param uid the uid + * @param isLocal the is local + */ + public void updateUserInfo(String uid, boolean isLocal) { tvUserId.setText(uid + ""); - tvUserType.setText(isLocal ? "Local": "Remote"); + tvUserType.setText(isLocal ? "Local" : "Remote"); } - public void updateStats(Map<String, String> states){ + /** + * Update stats. + * + * @param states the states + */ + public void updateStats(Map<String, String> states) { tlState.removeAllViews(); - if(states == null || states.size() <= 0){ + if (states == null || states.size() <= 0) { return; } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/AudioSeatManager.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/AudioSeatManager.java index 05bd94dea..0e1292787 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/AudioSeatManager.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/AudioSeatManager.java @@ -2,17 +2,32 @@ import android.view.View; +import java.util.ArrayList; + +/** + * The type Audio seat manager. + */ public class AudioSeatManager { private final AudioOnlyLayout[] audioOnlyLayouts; - public AudioSeatManager(AudioOnlyLayout... seats){ + /** + * Instantiates a new Audio seat manager. + * + * @param seats the seats + */ + public AudioSeatManager(AudioOnlyLayout...
seats) { audioOnlyLayouts = new AudioOnlyLayout[seats.length]; for (int i = 0; i < audioOnlyLayouts.length; i++) { audioOnlyLayouts[i] = seats[i]; } } + /** + * Up local seat. + * + * @param uid the uid + */ public void upLocalSeat(int uid) { AudioOnlyLayout localSeat = audioOnlyLayouts[0]; localSeat.setTag(uid); @@ -20,7 +35,12 @@ public void upLocalSeat(int uid) { localSeat.updateUserInfo(uid + "", true); } - public void upRemoteSeat(int uid){ + /** + * Up remote seat. + * + * @param uid the uid + */ + public void upRemoteSeat(int uid) { AudioOnlyLayout idleSeat = null; for (AudioOnlyLayout audioOnlyLayout : audioOnlyLayouts) { if (audioOnlyLayout.getTag() == null) { @@ -28,37 +48,70 @@ public void upRemoteSeat(int uid){ break; } } - if(idleSeat != null){ + if (idleSeat != null) { idleSeat.setTag(uid); idleSeat.setVisibility(View.VISIBLE); idleSeat.updateUserInfo(uid + "", false); } } - public void downSeat(int uid){ + /** + * Get seat remote uid list array list. + * + * @return the array list + */ + public ArrayList<Integer> getSeatRemoteUidList() { + ArrayList<Integer> uidList = new ArrayList<>(); + for (int i = 1; i < audioOnlyLayouts.length; i++) { + AudioOnlyLayout audioOnlyLayout = audioOnlyLayouts[i]; + Object tag = audioOnlyLayout.getTag(); + if (tag instanceof Integer) { + uidList.add((Integer) tag); + } + } + return uidList; + } + + /** + * Down seat. + * + * @param uid the uid + */ + public void downSeat(int uid) { AudioOnlyLayout seat = null; for (AudioOnlyLayout audioOnlyLayout : audioOnlyLayouts) { Object tag = audioOnlyLayout.getTag(); - if (tag instanceof Integer && (Integer)tag == uid) { + if (tag instanceof Integer && (Integer) tag == uid) { seat = audioOnlyLayout; break; } } - if(seat != null){ + if (seat != null) { seat.setTag(null); seat.setVisibility(View.INVISIBLE); } } - public AudioOnlyLayout getLocalSeat(){ + /** + * Get local seat audio only layout.
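The seat bookkeeping in `AudioSeatManager` above — seat 0 for the local user, the first idle seat for each remote user, an `Integer` tag marking the occupant — can be modelled without any Android view classes. A simplified plain-Java sketch (`SeatModel` is an illustrative name; the real class also toggles view visibility):

```java
import java.util.ArrayList;

// Minimal model of AudioSeatManager's seat assignment: index 0 is the local
// user, remote users take the first idle seat, and a null tag means idle.
public final class SeatModel {
    private final Integer[] tags;

    public SeatModel(int seatCount) {
        tags = new Integer[seatCount];
    }

    public void upLocalSeat(int uid) {
        tags[0] = uid;
    }

    public boolean upRemoteSeat(int uid) {
        for (int i = 1; i < tags.length; i++) {
            if (tags[i] == null) {   // first idle seat wins
                tags[i] = uid;
                return true;
            }
        }
        return false;                // no idle seat left
    }

    public void downSeat(int uid) {
        for (int i = 0; i < tags.length; i++) {
            if (tags[i] != null && tags[i] == uid) {
                tags[i] = null;      // free the seat
                break;
            }
        }
    }

    public ArrayList<Integer> getSeatRemoteUidList() {
        ArrayList<Integer> uids = new ArrayList<>();
        for (int i = 1; i < tags.length; i++) {
            if (tags[i] != null) {
                uids.add(tags[i]);
            }
        }
        return uids;
    }

    public static void main(String[] args) {
        SeatModel seats = new SeatModel(3);
        seats.upLocalSeat(100);
        seats.upRemoteSeat(200);
        seats.upRemoteSeat(300);
        seats.downSeat(200);
        System.out.println(seats.getSeatRemoteUidList()); // [300]
    }
}
```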
+ * + * @return the audio only layout + */ + public AudioOnlyLayout getLocalSeat() { return audioOnlyLayouts[0]; } - public AudioOnlyLayout getRemoteSeat(int uid){ + /** + * Get remote seat audio only layout. + * + * @param uid the uid + * @return the audio only layout + */ + public AudioOnlyLayout getRemoteSeat(int uid) { AudioOnlyLayout seat = null; for (AudioOnlyLayout audioOnlyLayout : audioOnlyLayouts) { Object tag = audioOnlyLayout.getTag(); - if (tag instanceof Integer && (Integer)tag == uid) { + if (tag instanceof Integer && (Integer) tag == uid) { seat = audioOnlyLayout; break; } @@ -66,7 +119,10 @@ public AudioOnlyLayout getRemoteSeat(int uid){ return seat; } - public void downAllSeats(){ + /** + * Down all seats. + */ + public void downAllSeats() { for (AudioOnlyLayout audioOnlyLayout : audioOnlyLayouts) { audioOnlyLayout.setTag(null); audioOnlyLayout.setVisibility(View.INVISIBLE); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/VideoReportLayout.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/VideoReportLayout.java index 16986f1c7..46fcace19 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/VideoReportLayout.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/VideoReportLayout.java @@ -16,20 +16,41 @@ import io.agora.api.example.common.model.StatisticsInfo; import io.agora.rtc2.IRtcEngineEventHandler; +/** + * The type Video report layout. + */ public class VideoReportLayout extends FrameLayout { private final StatisticsInfo statisticsInfo = new StatisticsInfo(); private TextView reportTextView; private int reportUid = -1; + /** + * Instantiates a new Video report layout. + * + * @param context the context + */ public VideoReportLayout(@NonNull Context context) { super(context); } + /** + * Instantiates a new Video report layout. 
+ * + * @param context the context + * @param attrs the attrs + */ public VideoReportLayout(@NonNull Context context, @Nullable AttributeSet attrs) { super(context, attrs); } + /** + * Instantiates a new Video report layout. + * + * @param context the context + * @param attrs the attrs + * @param defStyleAttr the def style attr + */ public VideoReportLayout(@NonNull Context context, @Nullable AttributeSet attrs, int defStyleAttr) { super(context, attrs, defStyleAttr); } @@ -56,25 +77,46 @@ public void onViewDetachedFromWindow(View v) { }); reportTextView.setTextColor(Color.parseColor("#eeeeee")); LayoutParams reportParams = new LayoutParams(ViewGroup.LayoutParams.WRAP_CONTENT, ViewGroup.LayoutParams.WRAP_CONTENT); - reportParams.topMargin = reportParams.leftMargin = 16; + reportParams.topMargin = 16; + reportParams.leftMargin = 16; addView(reportTextView, reportParams); } } + /** + * Sets report uid. + * + * @param uid the uid + */ public void setReportUid(int uid) { this.reportUid = uid; } + /** + * Gets report uid. + * + * @return the report uid + */ public int getReportUid() { return reportUid; } - public void setLocalAudioStats(IRtcEngineEventHandler.LocalAudioStats stats){ + /** + * Set local audio stats. + * + * @param stats the stats + */ + public void setLocalAudioStats(IRtcEngineEventHandler.LocalAudioStats stats) { statisticsInfo.setLocalAudioStats(stats); setReportText(statisticsInfo.getLocalVideoStats()); } - public void setLocalVideoStats(IRtcEngineEventHandler.LocalVideoStats stats){ + /** + * Set local video stats. + * + * @param stats the stats + */ + public void setLocalVideoStats(IRtcEngineEventHandler.LocalVideoStats stats) { if (stats.uid != reportUid) { return; } @@ -82,7 +124,12 @@ public void setLocalVideoStats(IRtcEngineEventHandler.LocalVideoStats stats){ setReportText(statisticsInfo.getLocalVideoStats()); } - public void setRemoteAudioStats(IRtcEngineEventHandler.RemoteAudioStats stats){ + /** + * Set remote audio stats. 
+ * + * @param stats the stats + */ + public void setRemoteAudioStats(IRtcEngineEventHandler.RemoteAudioStats stats) { if (stats.uid != reportUid) { return; } @@ -90,7 +137,12 @@ public void setRemoteAudioStats(IRtcEngineEventHandler.RemoteAudioStats stats){ setReportText(statisticsInfo.getRemoteVideoStats()); } - public void setRemoteVideoStats(IRtcEngineEventHandler.RemoteVideoStats stats){ + /** + * Set remote video stats. + * + * @param stats the stats + */ + public void setRemoteVideoStats(IRtcEngineEventHandler.RemoteVideoStats stats) { if (stats.uid != reportUid) { return; } @@ -100,9 +152,9 @@ public void setRemoteVideoStats(IRtcEngineEventHandler.RemoteVideoStats stats){ private void setReportText(String reportText) { - if(reportTextView != null){ + if (reportTextView != null) { reportTextView.post(() -> { - if(reportTextView != null){ + if (reportTextView != null) { reportTextView.setText(reportText); } }); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/WaveformView.java b/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/WaveformView.java index 839ebb022..6329cea8a 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/WaveformView.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/common/widget/WaveformView.java @@ -16,12 +16,15 @@ import io.agora.api.example.R; +/** + * The type Waveform view. + */ public class WaveformView extends View { private ArrayList datas = new ArrayList<>(); private short max = 100; private float mWidth; private float mHeight; - private float space =1f; + private float space = 1f; private Paint mWavePaint; private Paint baseLinePaint; private int mWaveColor = Color.WHITE; @@ -31,14 +34,32 @@ public class WaveformView extends View { private long drawTime; private boolean isMaxConstant = false; + /** + * Instantiates a new Waveform view. 
+ * + * @param context the context + */ public WaveformView(Context context) { this(context, null); } + /** + * Instantiates a new Waveform view. + * + * @param context the context + * @param attrs the attrs + */ public WaveformView(Context context, @Nullable AttributeSet attrs) { this(context, attrs, 0); } + /** + * Instantiates a new Waveform view. + * + * @param context the context + * @param attrs the attrs + * @param defStyleAttr the def style attr + */ public WaveformView(Context context, @Nullable AttributeSet attrs, int defStyleAttr) { super(context, attrs, defStyleAttr); init(attrs, defStyleAttr); @@ -69,8 +90,8 @@ private void init(AttributeSet attrs, int defStyle) { private void initPainters() { mWavePaint = new Paint(); - mWavePaint.setColor(mWaveColor);// set the paint color - mWavePaint.setStrokeWidth(waveStrokeWidth);// set the stroke width + mWavePaint.setColor(mWaveColor); // set the paint color + mWavePaint.setStrokeWidth(waveStrokeWidth); // set the stroke width mWavePaint.setAntiAlias(true); mWavePaint.setFilterBitmap(true); mWavePaint.setStrokeCap(Paint.Cap.ROUND); @@ -78,68 +99,138 @@ private void initPainters() { Shader shader = new LinearGradient(0, 0, 1000, 0, 0xffffffff, 0xFFe850ee, Shader.TileMode.CLAMP); mWavePaint.setShader(shader); baseLinePaint = new Paint(); - baseLinePaint.setColor(mBaseLineColor);// set the paint color - baseLinePaint.setStrokeWidth(1f);// set the stroke width + baseLinePaint.setColor(mBaseLineColor); // set the paint color + baseLinePaint.setStrokeWidth(1f); // set the stroke width baseLinePaint.setAntiAlias(true); baseLinePaint.setFilterBitmap(true); baseLinePaint.setStyle(Paint.Style.FILL); } + /** + * Gets max. + * + * @return the max + */ public short getMax() { return max; } + /** + * Sets max. + * + * @param max the max + */ public void setMax(short max) { this.max = max; } + /** + * Gets space. + * + * @return the space + */ public float getSpace() { return space; } + /** + * Sets space.
+ * + * @param space the space + */ public void setSpace(float space) { this.space = space; } + /** + * Gets wave color. + * + * @return the wave color + */ public int getmWaveColor() { return mWaveColor; } + /** + * Sets wave color. + * + * @param mWaveColor the m wave color + */ public void setmWaveColor(int mWaveColor) { this.mWaveColor = mWaveColor; invalidateNow(); } + /** + * Gets base line color. + * + * @return the base line color + */ public int getmBaseLineColor() { return mBaseLineColor; } + /** + * Sets base line color. + * + * @param mBaseLineColor the m base line color + */ public void setmBaseLineColor(int mBaseLineColor) { this.mBaseLineColor = mBaseLineColor; invalidateNow(); } + /** + * Gets wave stroke width. + * + * @return the wave stroke width + */ public float getWaveStrokeWidth() { return waveStrokeWidth; } + /** + * Sets wave stroke width. + * + * @param waveStrokeWidth the wave stroke width + */ public void setWaveStrokeWidth(float waveStrokeWidth) { this.waveStrokeWidth = waveStrokeWidth; invalidateNow(); } + /** + * Gets invalidate time. + * + * @return the invalidate time + */ public int getInvalidateTime() { return invalidateTime; } + /** + * Sets invalidate time. + * + * @param invalidateTime the invalidate time + */ public void setInvalidateTime(int invalidateTime) { this.invalidateTime = invalidateTime; } + /** + * Is max constant boolean. + * + * @return the boolean + */ public boolean isMaxConstant() { return isMaxConstant; } + /** + * Sets max constant. + * + * @param maxConstant the max constant + */ public void setMaxConstant(boolean maxConstant) { isMaxConstant = maxConstant; } @@ -152,6 +243,11 @@ public void invalidateNow() { invalidate(); } + /** + * Add data. + * + * @param data the data + */ public void addData(short data) { if (data < 0) { @@ -175,6 +271,9 @@ public void addData(short data) { } + /** + * Clear. 
+ */ public void clear() { datas.clear(); invalidateNow(); @@ -196,7 +295,7 @@ protected void onSizeChanged(int w, int h, int oldw, int oldh) { private void drawWave(Canvas mCanvas) { for (int i = 0; i < datas.size(); i++) { - float x = (i) * space; + float x = i * space; float y = (float) datas.get(i) / max * mHeight / 2; mCanvas.drawLine(x, -y, x, y, mWavePaint); } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CDNStreaming/AudienceFragment.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CDNStreaming/AudienceFragment.java index f1fc70f00..ecffd3757 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CDNStreaming/AudienceFragment.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CDNStreaming/AudienceFragment.java @@ -34,6 +34,8 @@ import io.agora.api.example.common.BaseFragment; import io.agora.mediaplayer.IMediaPlayer; import io.agora.mediaplayer.IMediaPlayerObserver; +import io.agora.mediaplayer.data.CacheStatistics; +import io.agora.mediaplayer.data.PlayerPlaybackStats; import io.agora.mediaplayer.data.PlayerUpdatedInfo; import io.agora.mediaplayer.data.SrcInfo; import io.agora.rtc2.ChannelMediaOptions; @@ -45,6 +47,9 @@ import io.agora.rtc2.video.VideoCanvas; import io.agora.rtc2.video.VideoEncoderConfiguration; +/** + * The type Audience fragment. + */ public class AudienceFragment extends BaseFragment implements IMediaPlayerObserver { private static final String TAG = AudienceFragment.class.getSimpleName(); private static final String AGORA_CHANNEL_PREFIX = "rtmp://pull.webdemo.agoraio.cn/lbhd/"; @@ -104,29 +109,29 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. 
See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. 
*/ @@ -192,7 +197,7 @@ public void onDestroy() { engine.leaveChannel(); } mediaPlayer.stop(); - /**leaveChannel and Destroy the RtcEngine instance*/ + /*leaveChannel and Destroy the RtcEngine instance*/ engine.stopPreview(); handler.post(RtcEngine::destroy); engine = null; @@ -252,7 +257,7 @@ public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; @@ -263,7 +268,7 @@ public void onUserJoined(int uid, int elapsed) { handler.post(new Runnable() { @Override public void run() { - /**Display remote video stream*/ + /*Display remote video stream*/ SurfaceView surfaceView = null; // Create render view by RtcEngine surfaceView = RtcEngine.CreateRendererView(context); @@ -297,7 +302,7 @@ public void onUserOffline(int uid, int reason) { handler.post(new Runnable() { @Override public void run() { - /**Clear render view + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); @@ -419,12 +424,12 @@ private String getUrl() { } @Override - public void onPlayerStateChanged(io.agora.mediaplayer.Constants.MediaPlayerState mediaPlayerState, io.agora.mediaplayer.Constants.MediaPlayerError mediaPlayerError) { - showShortToast("player state change to " + mediaPlayerState.name()); + public void onPlayerStateChanged(io.agora.mediaplayer.Constants.MediaPlayerState state, io.agora.mediaplayer.Constants.MediaPlayerReason reason) { + showShortToast("player state change to " + state.name()); handler.post(new Runnable() { @Override public void run() { - switch (mediaPlayerState) { + switch (state) { case PLAYER_STATE_FAILED: mediaPlayer.stop(); //showLongToast(String.format("media 
player error: %s", mediaPlayerError.name())); @@ -439,13 +444,14 @@ public void run() { .setPositiveButton(R.string.confirm, (dialog, which) -> openPlayerWithUrl()) .create(); } - mPlayerFailDialog.setMessage(getString(R.string.media_player_error, mediaPlayerError.name()) + "\n\n" + getString(R.string.reopen_url_again)); + mPlayerFailDialog.setMessage(getString(R.string.media_player_error, reason.name()) + "\n\n" + getString(R.string.reopen_url_again)); mPlayerFailDialog.show(); break; case PLAYER_STATE_OPEN_COMPLETED: mediaPlayer.play(); - if (isAgoraChannel) + if (isAgoraChannel) { loadAgoraChannels(); + } rtcSwitcher.setEnabled(true); if (mPlayerFailDialog != null) { mPlayerFailDialog.dismiss(); @@ -492,7 +498,7 @@ private List getChannelArray(int count) { } @Override - public void onPositionChanged(long l) { + public void onPositionChanged(long positionMs, long timestampMs) { } @@ -547,6 +553,16 @@ public void onPlayerInfoUpdated(PlayerUpdatedInfo playerUpdatedInfo) { } + @Override + public void onPlayerCacheStats(CacheStatistics stats) { + + } + + @Override + public void onPlayerPlaybackStats(PlayerPlaybackStats stats) { + + } + @Override public void onAudioVolumeIndication(int i) { diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CDNStreaming/EntryFragment.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CDNStreaming/EntryFragment.java index 81045d696..bed4e0185 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CDNStreaming/EntryFragment.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CDNStreaming/EntryFragment.java @@ -21,6 +21,9 @@ import io.agora.api.example.annotation.Example; import io.agora.api.example.common.BaseFragment; +/** + * The type Entry fragment. 
+ */ @Example( index = 2, group = ADVANCED, @@ -28,31 +31,28 @@ actionId = R.id.action_mainFragment_to_CDNStreaming, tipsId = R.string.rtmpstreaming ) -public class EntryFragment extends BaseFragment implements View.OnClickListener -{ +public class EntryFragment extends BaseFragment implements View.OnClickListener { private static final String TAG = EntryFragment.class.getSimpleName(); private Spinner streamMode; private EditText et_channel; - private boolean isAgoraChannel(){ + private boolean isAgoraChannel() { return "AGORA_CHANNEL".equals(streamMode.getSelectedItem().toString()); } - private String getChannelName(){ + private String getChannelName() { return et_channel.getText().toString(); } @Nullable @Override - public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) - { + public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) { View view = inflater.inflate(R.layout.fragment_cdn_entry, container, false); return view; } @Override - public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) - { + public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) { super.onViewCreated(view, savedInstanceState); view.findViewById(R.id.btn_host_join).setOnClickListener(this); view.findViewById(R.id.btn_audience_join).setOnClickListener(this); @@ -61,7 +61,7 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat streamMode.setOnItemSelectedListener(new StreamModeOnItemSelectedListener()); } - private class StreamModeOnItemSelectedListener implements AdapterView.OnItemSelectedListener { + private final class StreamModeOnItemSelectedListener implements AdapterView.OnItemSelectedListener { @Override public void onItemSelected(AdapterView adapter, View view, int position, long id) { et_channel.setHint(position == 0 ? 
R.string.agora_channel_hint : R.string.cdn_url_hint); @@ -73,23 +73,19 @@ public void onNothingSelected(AdapterView arg0) { } @Override - public void onActivityCreated(@Nullable Bundle savedInstanceState) - { + public void onActivityCreated(@Nullable Bundle savedInstanceState) { super.onActivityCreated(savedInstanceState); } @Override - public void onDestroy() - { + public void onDestroy() { super.onDestroy(); } @Override - public void onClick(View v) - { + public void onClick(View v) { // Check permission - if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) - { + if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) { join(v); return; } @@ -98,15 +94,14 @@ public void onClick(View v) Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted join(v); }).start(); } private void join(View v) { - if (v.getId() == R.id.btn_host_join){ + if (v.getId() == R.id.btn_host_join) { Bundle bundle = new Bundle(); bundle.putString(getString(R.string.key_channel_name), getChannelName()); bundle.putBoolean(getString(R.string.key_is_agora_channel), isAgoraChannel()); @@ -114,8 +109,7 @@ private void join(View v) { R.id.action_cdn_streaming_to_host, bundle ); - } - else if (v.getId() == R.id.btn_audience_join){ + } else if (v.getId() == R.id.btn_audience_join) { Bundle bundle = new Bundle(); bundle.putString(getString(R.string.key_channel_name), getChannelName()); bundle.putBoolean(getString(R.string.key_is_agora_channel), isAgoraChannel()); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CDNStreaming/HostFragment.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CDNStreaming/HostFragment.java index 6f065a308..90d6dee98 100644 --- 
a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CDNStreaming/HostFragment.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CDNStreaming/HostFragment.java @@ -30,8 +30,8 @@ import io.agora.api.example.common.BaseFragment; import io.agora.rtc2.ChannelMediaOptions; import io.agora.rtc2.Constants; -import io.agora.rtc2.DirectCdnStreamingError; import io.agora.rtc2.DirectCdnStreamingMediaOptions; +import io.agora.rtc2.DirectCdnStreamingReason; import io.agora.rtc2.DirectCdnStreamingState; import io.agora.rtc2.DirectCdnStreamingStats; import io.agora.rtc2.IDirectCdnStreamingEventHandler; @@ -45,6 +45,9 @@ import io.agora.rtc2.video.VideoCanvas; import io.agora.rtc2.video.VideoEncoderConfiguration; +/** + * The type Host fragment. + */ public class HostFragment extends BaseFragment { private static final String TAG = HostFragment.class.getSimpleName(); private static final String AGORA_CHANNEL_PREFIX = "rtmp://push.webdemo.agoraio.cn/lbhd/"; @@ -112,29 +115,29 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. 
A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -179,7 +182,7 @@ private void setupEngineConfig(Context context) { engine.startPreview(); // Set audio route to microPhone engine.setDefaultAudioRoutetoSpeakerphone(true); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enable video module engine.enableVideo(); @@ -187,7 +190,6 @@ private void setupEngineConfig(Context context) { VideoEncoderConfiguration.VideoDimensions videoDimensions = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(); canvas_height = Math.max(videoDimensions.height, videoDimensions.width); canvas_width = Math.min(videoDimensions.height, videoDimensions.width); -// VideoEncoderConfiguration.FRAME_RATE frameRate = VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()); videoEncoderConfiguration = new VideoEncoderConfiguration( videoDimensions, VideoEncoderConfiguration.FRAME_RATE.FRAME_RATE_FPS_15, STANDARD_BITRATE, VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_FIXED_PORTRAIT ); @@ -254,7 +256,7 @@ public void onDestroy() { } else if (cdnStreaming) { engine.stopDirectCdnStreaming(); } - /**leaveChannel and 
Destroy the RtcEngine instance*/ + /*leaveChannel and Destroy the RtcEngine instance*/ engine.stopPreview(); handler.post(RtcEngine::destroy); engine = null; @@ -319,7 +321,7 @@ public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; @@ -331,7 +333,7 @@ public void onUserJoined(int uid, int elapsed) { handler.post(new Runnable() { @Override public void run() { - /**Display remote video stream*/ + /*Display remote video stream*/ SurfaceView surfaceView = null; // Create render view by RtcEngine surfaceView = RtcEngine.CreateRendererView(context); @@ -365,7 +367,7 @@ public void onUserOffline(int uid, int reason) { handler.post(new Runnable() { @Override public void run() { - /**Clear render view + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); @@ -417,7 +419,7 @@ private void updateTranscodeLayout() { user.height = hasRemote ? 
canvas_height / 2 : canvas_height; user.uid = localUid; liveTranscoding.addUser(user); - if(hasRemote){ + if (hasRemote) { int index = 0; for (int uid : remoteViews.keySet()) { index++; @@ -459,13 +461,14 @@ private void updateTranscodeLayout() { private final IDirectCdnStreamingEventHandler iDirectCdnStreamingEventHandler = new IDirectCdnStreamingEventHandler() { + @Override - public void onDirectCdnStreamingStateChanged(DirectCdnStreamingState directCdnStreamingState, DirectCdnStreamingError directCdnStreamingError, String s) { - showShortToast(String.format("onDirectCdnStreamingStateChanged state:%s, error:%s", directCdnStreamingState, directCdnStreamingError)); + public void onDirectCdnStreamingStateChanged(DirectCdnStreamingState state, DirectCdnStreamingReason reason, String message) { + showShortToast(String.format("onDirectCdnStreamingStateChanged state:%s, error:%s", state, reason)); runOnUIThread(new Runnable() { @Override public void run() { - switch (directCdnStreamingState) { + switch (state) { case RUNNING: streamingButton.setText(R.string.stop_streaming); cdnStreaming = true; @@ -489,7 +492,7 @@ public void run() { case FAILED: showLongToast(String.format("Start Streaming failed, please go back to previous page and check the settings.")); default: - Log.i(TAG, String.format("onDirectCdnStreamingStateChanged, state: %s error: %s message: %s", directCdnStreamingState.name(), directCdnStreamingError.name(), s)); + Log.i(TAG, String.format("onDirectCdnStreamingStateChanged, state: %s error: %s message: %s", state.name(), reason.name(), message)); } rtcSwitcher.setEnabled(true); } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ChannelEncryption.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ChannelEncryption.java index bbbfda211..2e555a3d1 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ChannelEncryption.java +++ 
b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ChannelEncryption.java @@ -14,10 +14,12 @@ import android.view.SurfaceView; import android.view.View; import android.view.ViewGroup; +import android.widget.AdapterView; import android.widget.Button; import android.widget.EditText; import android.widget.FrameLayout; import android.widget.Spinner; +import android.widget.Toast; import androidx.annotation.NonNull; import androidx.annotation.Nullable; @@ -25,6 +27,7 @@ import com.yanzhenjie.permission.AndPermission; import com.yanzhenjie.permission.runtime.Permission; +import java.lang.reflect.Method; import java.nio.charset.StandardCharsets; import io.agora.api.example.MainApplication; @@ -42,7 +45,9 @@ import io.agora.rtc2.video.VideoCanvas; import io.agora.rtc2.video.VideoEncoderConfiguration; -/**This demo demonstrates how to make a one-to-one video call*/ +/** + * This demo demonstrates how to make a one-to-one video call + */ @Example( index = 14, group = ADVANCED, @@ -50,8 +55,7 @@ actionId = R.id.action_mainFragment_to_channel_encryption, tipsId = R.string.channelencryption ) -public class ChannelEncryption extends BaseFragment implements View.OnClickListener -{ +public class ChannelEncryption extends BaseFragment implements View.OnClickListener, AdapterView.OnItemSelectedListener { private static final String TAG = ChannelEncryption.class.getSimpleName(); private FrameLayout fl_local, fl_remote; @@ -64,15 +68,13 @@ public class ChannelEncryption extends BaseFragment implements View.OnClickListe @Nullable @Override - public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) - { + public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) { View view = inflater.inflate(R.layout.fragment_channel_encryption, container, false); return view; } @Override - public void onViewCreated(@NonNull View view, 
@Nullable Bundle savedInstanceState) - { + public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) { super.onViewCreated(view, savedInstanceState); join = view.findViewById(R.id.btn_join); et_channel = view.findViewById(R.id.et_channel); @@ -81,45 +83,43 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat fl_local = view.findViewById(R.id.fl_local); fl_remote = view.findViewById(R.id.fl_remote); encry_mode = view.findViewById(R.id.encry_mode_spinner); + encry_mode.setOnItemSelectedListener(this); } @Override - public void onActivityCreated(@Nullable Bundle savedInstanceState) - { + public void onActivityCreated(@Nullable Bundle savedInstanceState) { super.onActivityCreated(savedInstanceState); // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } - try - { + try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. 
*/ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -137,21 +137,20 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) // This api can only be used in the private media server scenario, otherwise some problems may occur. engine.setLocalAccessPoint(localAccessPointConfiguration); } - } - catch (Exception e) - { + } catch (Exception e) { e.printStackTrace(); getActivity().onBackPressed(); } } @Override - public void onDestroy() - { + public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ - if(engine != null) - { + if (joined && encry_mode.getSelectedItem().toString().equals(getString(R.string.custom))) { + enablePacketProcessor(false); + } + /* leaveChannel and Destroy the RtcEngine instance */ + if (engine != null) { engine.leaveChannel(); } handler.post(RtcEngine::destroy); @@ -159,27 +158,44 @@ public void onDestroy() } @Override - public void onClick(View v) - { - if (v.getId() == R.id.btn_join) - { - if (!joined) - { - // Creates an EncryptionConfig instance. - EncryptionConfig config = new EncryptionConfig(); - // Sets the encryption mode as AES_128_XTS. - config.encryptionMode = EncryptionConfig.EncryptionMode.valueOf(encry_mode.getSelectedItem().toString()); - // Sets the encryption key. - config.encryptionKey = et_password.getText().toString(); - System.arraycopy(getKdfSaltFromServer(), 0, config.encryptionKdfSalt, 0, config.encryptionKdfSalt.length); - // Enables the built-in encryption. 
- engine.enableEncryption(true, config); + public void onItemSelected(AdapterView parent, View view, int position, long id) { + if (parent == encry_mode) { + if (encry_mode.getSelectedItem().equals(getString(R.string.custom))) { + et_password.setText(""); + et_password.setEnabled(false); + } else { + et_password.setEnabled(true); + } + } + } + + @Override + public void onNothingSelected(AdapterView parent) { + + } + + @Override + public void onClick(View v) { + if (v.getId() == R.id.btn_join) { + if (!joined) { + if (encry_mode.getSelectedItem().toString().equals(getString(R.string.custom))) { + enablePacketProcessor(true); + } else { + // Creates an EncryptionConfig instance. + EncryptionConfig config = new EncryptionConfig(); + // Sets the encryption mode as AES_128_XTS. + config.encryptionMode = EncryptionConfig.EncryptionMode.valueOf(encry_mode.getSelectedItem().toString()); + // Sets the encryption key. + config.encryptionKey = et_password.getText().toString(); + System.arraycopy(getKdfSaltFromServer(), 0, config.encryptionKdfSalt, 0, config.encryptionKdfSalt.length); + // Enables the built-in encryption. 
+ engine.enableEncryption(true, config); + } CommonUtil.hideInputBoard(getActivity(), et_channel); // call when join button hit String channelId = et_channel.getText().toString(); // Check permission - if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) - { + if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) { joinChannel(channelId); return; } @@ -188,16 +204,16 @@ public void onClick(View v) Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); - } - else - { + } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + if (encry_mode.getSelectedItem().toString().equals(getString(R.string.custom))) { + enablePacketProcessor(false); + } + /* After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. 
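For reference, the built-in encryption path added in the hunk above copies a 32-byte KDF salt into `EncryptionConfig.encryptionKdfSalt`. The sketch below is a minimal stand-alone model of that copy, assuming a plain `byte[32]` in place of the SDK field so it runs without the Agora SDK on the classpath; in a real app the salt must come from your own server, not a hard-coded literal:

```java
import java.nio.charset.StandardCharsets;

public class KdfSaltDemo {
    // Mirrors getKdfSaltFromServer() in the patch; in production this value
    // must be fetched from your own server, not embedded in the client.
    static byte[] getKdfSaltFromServer() {
        return "EncryptionKdfSaltInBase64Strings".getBytes(StandardCharsets.UTF_8);
    }

    // Stand-in for the copy into EncryptionConfig.encryptionKdfSalt,
    // which the SDK defines as a 32-byte array.
    static byte[] fillKdfSalt() {
        byte[] encryptionKdfSalt = new byte[32];
        System.arraycopy(getKdfSaltFromServer(), 0, encryptionKdfSalt, 0, encryptionKdfSalt.length);
        return encryptionKdfSalt;
    }

    public static void main(String[] args) {
        System.out.println(fillKdfSalt().length); // 32
    }
}
```

Note that `System.arraycopy` takes its length from the destination array here, so a server salt shorter than 32 bytes would throw `ArrayIndexOutOfBoundsException` at the copy.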
@@ -216,22 +232,36 @@ public void onClick(View v) * triggers the removeInjectStreamUrl method.*/ engine.leaveChannel(); join.setText(getString(R.string.join)); - et_password.setEnabled(true); + et_password.setEnabled(!encry_mode.getSelectedItem().equals(getString(R.string.custom))); encry_mode.setEnabled(true); } } } + private void enablePacketProcessor(boolean enable) { + try { + Class aClass = Class.forName("io.agora.api.streamencrypt.PacketProcessor"); + if (enable) { + Method registerProcessing = aClass.getDeclaredMethod("registerProcessing", long.class); + registerProcessing.invoke(null, engine.getNativeHandle()); + } else { + Method unregisterProcessing = aClass.getDeclaredMethod("unregisterProcessing", long.class); + unregisterProcessing.invoke(null, engine.getNativeHandle()); + } + } catch (Exception e) { + Log.w(TAG, "", e); + Toast.makeText(requireContext(), R.string.custom_stream_encrypt_tip, Toast.LENGTH_SHORT).show(); + } + } + private byte[] getKdfSaltFromServer() { return "EncryptionKdfSaltInBase64Strings".getBytes(StandardCharsets.UTF_8); } - private void joinChannel(String channelId) - { + private void joinChannel(String channelId) { // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } @@ -239,8 +269,7 @@ private void joinChannel(String channelId) SurfaceView surfaceView = new SurfaceView(context); // Local video is on the top surfaceView.setZOrderMediaOverlay(true); - if(fl_local.getChildCount() > 0) - { + if (fl_local.getChildCount() > 0) { fl_local.removeAllViews(); } // Add to the local container @@ -249,7 +278,7 @@ private void joinChannel(String channelId) engine.setupLocalVideo(new VideoCanvas(surfaceView, RENDER_MODE_HIDDEN, 0)); // Set audio route to microPhone engine.setDefaultAudioRoutetoSpeakerphone(true); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ 
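The `enablePacketProcessor` helper added above resolves `io.agora.api.streamencrypt.PacketProcessor` by name so this fragment still compiles when the optional stream-encrypt module is excluded from the build, falling back to a toast on failure. Below is a minimal self-contained sketch of the same register/unregister pattern; the nested `PacketProcessor` class is a hypothetical stand-in for the real module, which takes the native engine handle from `engine.getNativeHandle()`:

```java
import java.lang.reflect.Method;

public class ReflectiveToggleDemo {
    /** Hypothetical local stand-in for the optional PacketProcessor module. */
    public static class PacketProcessor {
        public static long registeredHandle = 0L;

        public static void registerProcessing(long nativeHandle) {
            registeredHandle = nativeHandle;
        }

        public static void unregisterProcessing(long nativeHandle) {
            registeredHandle = 0L;
        }
    }

    // Same shape as enablePacketProcessor(boolean): resolve the class by name
    // so callers compile even when the module is absent, then invoke the
    // static register/unregister entry point with the native engine handle.
    public static void enablePacketProcessor(String className, boolean enable, long nativeHandle) throws Exception {
        Class<?> aClass = Class.forName(className);
        Method method = aClass.getDeclaredMethod(
                enable ? "registerProcessing" : "unregisterProcessing", long.class);
        method.invoke(null, nativeHandle);
    }

    public static void main(String[] args) throws Exception {
        String name = PacketProcessor.class.getName();
        enablePacketProcessor(name, true, 42L);
        System.out.println(PacketProcessor.registeredHandle);  // 42
        enablePacketProcessor(name, false, 42L);
        System.out.println(PacketProcessor.registeredHandle);  // 0
    }
}
```

Catching the broad `Exception` around the reflective call, as the patch does, is what lets the demo degrade gracefully when the class is missing at runtime.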
engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enable video module engine.enableVideo(); @@ -261,17 +290,16 @@ private void joinChannel(String channelId) ORIENTATION_MODE_ADAPTIVE )); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, ret -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ int res = engine.joinChannel(ret, channelId, "Extra Optional Data", 0); - if (res != 0) - { + if (res != 0) { // Usually happens with invalid parameters // Error code description can be found at: // en: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html @@ -289,16 +317,14 @@ private void joinChannel(String channelId) * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. 
*/ - private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() - { + private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() { /** * Error code description can be found at: * en: https://api-ref.agora.io/en/video-sdk/android/4.x/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror * cn: https://docs.agora.io/cn/video-call-4.x/API%20Reference/java_ng/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror */ @Override - public void onError(int err) - { + public void onError(int err) { Log.e(TAG, String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err))); showAlert(String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err)), false); } @@ -307,8 +333,7 @@ public void onError(int err) * @param stats With this callback, the application retrieves the channel information, * such as the call duration and statistics.*/ @Override - public void onLeaveChannel(RtcStats stats) - { + public void onLeaveChannel(RtcStats stats) { super.onLeaveChannel(stats); Log.i(TAG, String.format("local user %d leaveChannel!", myUid)); showLongToast(String.format("local user %d leaveChannel!", myUid)); @@ -322,17 +347,14 @@ public void onLeaveChannel(RtcStats stats) * @param uid User ID * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/ @Override - public void onJoinChannelSuccess(String channel, int uid, int elapsed) - { + public void onJoinChannelSuccess(String channel, int uid, int elapsed) { Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); showLongToast(String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); myUid = uid; joined = true; - handler.post(new Runnable() - { + handler.post(new Runnable() { @Override - public void run() - { + public void run() { join.setEnabled(true); join.setText(getString(R.string.leave)); 
et_password.setEnabled(false); @@ -417,8 +439,7 @@ public void onRemoteAudioStateChanged(int uid, int state, int reason, int elapse * @param elapsed Time elapsed (ms) from the local user calling the joinChannel method until * the SDK triggers this callback.*/ @Override - public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) - { + public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) { super.onRemoteVideoStateChanged(uid, state, reason, elapsed); Log.i(TAG, "onRemoteVideoStateChanged->" + uid + ", state->" + state + ", reason->" + reason); } @@ -428,23 +449,20 @@ public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapse * @param elapsed Time delay (ms) from the local user calling joinChannel/setClientRole * until this callback is triggered.*/ @Override - public void onUserJoined(int uid, int elapsed) - { + public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); resetAlert(); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; } - handler.post(() -> - { - /**Display remote video stream*/ + handler.post(() -> { + /*Display remote video stream*/ SurfaceView surfaceView = null; - if (fl_remote.getChildCount() > 0) - { + if (fl_remote.getChildCount() > 0) { fl_remote.removeAllViews(); } // Create render view by RtcEngine @@ -470,15 +488,14 @@ public void onUserJoined(int uid, int elapsed) * USER_OFFLINE_BECOME_AUDIENCE(2): (Live broadcast only.) The client role switched from * the host to the audience.*/ @Override - public void onUserOffline(int uid, int reason) - { + public void onUserOffline(int uid, int reason) { Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason)); showLongToast(String.format("user %d offline! 
reason:%d", uid, reason)); resetAlert(); handler.post(new Runnable() { @Override public void run() { - /**Clear render view + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); @@ -486,4 +503,6 @@ public void run() { }); } }; + + } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ContentInspect.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ContentInspect.java index 2840b4988..8a36ffac1 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ContentInspect.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ContentInspect.java @@ -75,7 +75,7 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat fl_local = view.findViewById(R.id.fl_local); contentInspectRetTv = view.findViewById(R.id.ret_content_inspect); view.findViewById(R.id.btn_switch_camera).setOnClickListener(v -> { - if(engine != null && joined){ + if (engine != null && joined) { engine.switchCamera(); } }); @@ -91,30 +91,30 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. 
Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -141,7 +141,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { @Override public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ + /*leaveChannel and Destroy the RtcEngine instance*/ if (engine != null) { engine.leaveChannel(); } @@ -166,14 +166,13 @@ public void onClick(View v) { Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. 
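The "reporting the usages of APIExample" comment kept in this hunk precedes an `engine.setParameters(...)` call that these demos assemble by string concatenation (spelled out in full in the new FaceCapture file later in this patch). Expanded, the concatenation yields the JSON shown below; `getSdkVersion()` is a stub here (assumed to return `"4.3.0"`) so the sketch runs without the SDK:

```java
public class ReportParamsDemo {
    // Hypothetical stand-in for RtcEngine.getSdkVersion(); assumed "4.3.0"
    // so this sketch runs without the Agora SDK on the classpath.
    static String getSdkVersion() {
        return "4.3.0";
    }

    // Same concatenation shape as the engine.setParameters(...) call in the
    // APIExample demos; the int operands are converted to digits by +.
    static String buildParams() {
        return "{"
                + "\"rtc.report_app_scenario\":"
                + "{"
                + "\"appScenario\":" + 100 + ","
                + "\"serviceType\":" + 11 + ","
                + "\"appVersion\":\"" + getSdkVersion() + "\""
                + "}"
                + "}";
    }

    public static void main(String[] args) {
        System.out.println(buildParams());
        // → {"rtc.report_app_scenario":{"appScenario":100,"serviceType":11,"appVersion":"4.3.0"}}
    }
}
```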
@@ -216,7 +215,7 @@ private void joinChannel(String channelId) { // Set audio route to microPhone engine.setDefaultAudioRoutetoSpeakerphone(true); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enable video module engine.enableVideo(); @@ -238,7 +237,7 @@ private void joinChannel(String channelId) { engine.enableContentInspect(true, contentInspectConfig); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see @@ -246,7 +245,7 @@ private void joinChannel(String channelId) { TokenUtils.gen(requireContext(), channelId, 0, new TokenUtils.OnTokenGenCallback() { @Override public void onTokenGen(String ret) { - /** Allows a user to join a channel. + /* Allows a user to join a channel. 
if you do not specify the uid, we will generate the uid for you*/ ChannelMediaOptions option = new ChannelMediaOptions(); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CustomRemoteVideoRender.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CustomRemoteVideoRender.java index 8b386691d..13edffa2e 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CustomRemoteVideoRender.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/CustomRemoteVideoRender.java @@ -106,30 +106,30 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. 
*/ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -147,8 +147,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { // This api can only be used in the private media server scenario, otherwise some problems may occur. engine.setLocalAccessPoint(localAccessPointConfiguration); } - } - catch (Exception e) { + } catch (Exception e) { e.printStackTrace(); getActivity().onBackPressed(); } @@ -157,16 +156,16 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { @Override public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ + + engine.registerVideoFrameObserver(null); + + /*leaveChannel and Destroy the RtcEngine instance*/ if (textureBufferHelper != null) { textureBufferHelper.dispose(); textureBufferHelper = null; } - if (yuvUploader != null) { - yuvUploader.release(); - } - if(engine != null) - { + yuvUploader.release(); + if (engine != null) { engine.leaveChannel(); engine.stopPreview(); } @@ -191,14 +190,13 @@ public void onClick(View v) { Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. 
This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. @@ -215,6 +213,7 @@ public void onClick(View v) { * the onLeaveChannel callback. * 2:If you call the leaveChannel method during CDN live streaming, the SDK * triggers the removeInjectStreamUrl method.*/ + engine.registerVideoFrameObserver(null); engine.leaveChannel(); engine.stopPreview(); remoteUid = 0; @@ -241,25 +240,25 @@ private void joinChannel(String channelId) { fl_local.addView(surfaceView, new FrameLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT)); // Setup local video to render your local camera preview engine.setupLocalVideo(new VideoCanvas(surfaceView, RENDER_MODE_HIDDEN, 0)); - /**Set up to play remote sound with receiver*/ + /*Set up to play remote sound with receiver*/ engine.setDefaultAudioRoutetoSpeakerphone(true); engine.registerVideoFrameObserver(videoFrameObserver); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enable video module engine.enableVideo(); // Setup video encoding configs engine.setVideoEncoderConfiguration(new VideoEncoderConfiguration( - ((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), - VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), + ((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), + VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), STANDARD_BITRATE, - 
VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) + VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) )); engine.startPreview(); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see @@ -268,7 +267,7 @@ private void joinChannel(String channelId) { @Override public void onTokenGen(String ret) { - /** Allows a user to join a channel. + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ ChannelMediaOptions option = new ChannelMediaOptions(); @@ -314,7 +313,7 @@ public void onLeaveChannel(RtcStats stats) { Log.i(TAG, String.format("local user %d leaveChannel!", myUid)); showLongToast(String.format("local user %d leaveChannel!", myUid)); lastI420Frame = null; - if(mSurfaceView != null){ + if (mSurfaceView != null) { mSurfaceView.requestRender(); } } @@ -426,27 +425,24 @@ public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapse * @param elapsed Time delay (ms) from the local user calling joinChannel/setClientRole * until this callback is triggered.*/ @Override - public void onUserJoined(int uid, int elapsed) - { + public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); remoteUid = uid; Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null 
|| mSurfaceView != null) { return; } - handler.post(() -> - { - /**Display remote video stream*/ + handler.post(() -> { + /*Display remote video stream*/ mSurfaceView = new GLTextureView(context); mSurfaceView.setPreserveEGLContextOnPause(true); mSurfaceView.setEGLContextClientVersion(2); mSurfaceView.setRenderer(glRenderer); mSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY); - if (fl_remote.getChildCount() > 0) - { + if (fl_remote.getChildCount() > 0) { fl_remote.removeAllViews(); } // Add to the remote container @@ -468,13 +464,16 @@ public void onUserJoined(int uid, int elapsed) public void onUserOffline(int uid, int reason) { Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason)); showLongToast(String.format("user %d offline! reason:%d", uid, reason)); - if(mSurfaceView != null){ + if (mSurfaceView != null) { mSurfaceView.requestRender(); } remoteUid = 0; } }; + /** + * The Video frame observer. + */ IVideoFrameObserver videoFrameObserver = new IVideoFrameObserver() { @Override public boolean onCaptureVideoFrame(int sourceType, VideoFrame videoFrame) { @@ -494,7 +493,7 @@ public boolean onMediaPlayerVideoFrame(VideoFrame videoFrame, int i) { @Override public boolean onRenderVideoFrame(String s, int i, VideoFrame videoFrame) { // Log.d(TAG, "onRenderVideoFrame: " + i + " connection: " + rtcConnection.id + " buffer: " + videoFrame.getBuffer()); - if (mSurfaceView != null && videoFrame != lastI420Frame){ + if (mSurfaceView != null && videoFrame != lastI420Frame) { Log.d(TAG, "onRenderVideoFrame: " + i + " connection: " + s + " buffer: " + videoFrame.getBuffer()); lastI420Frame = videoFrame; textureBufferHelper.invoke(new Callable() { @@ -538,7 +537,10 @@ public int getObservedFramePosition() { return 0; } }; - GLTextureView.Renderer glRenderer = new GLTextureView.Renderer(){ + /** + * The Gl renderer. 
+ */ + GLTextureView.Renderer glRenderer = new GLTextureView.Renderer() { @Override public void onSurfaceCreated(GL10 gl, EGLConfig config) { Log.d(TAG, "onSurfaceCreated"); @@ -556,7 +558,9 @@ public void onSurfaceChanged(GL10 gl, int width, int height) { public void onDrawFrame(GL10 gl) { GLES20.glClearColor(0 /* red */, 0 /* green */, 0 /* blue */, 0 /* alpha */); GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT); - if (lastI420Frame == null) return; + if (lastI420Frame == null) { + return; + } Log.d(TAG, "onDrawFrame: " + lastI420Frame.getRotation()); renderMatrix.reset(); renderMatrix.preTranslate(0.5f, 0.5f); @@ -567,8 +571,7 @@ public void onDrawFrame(GL10 gl) { drawer.drawYuv(yuvUploader.getYuvTextures(), RendererCommon.convertMatrixFromAndroidGraphicsMatrix(renderMatrix), lastI420Frame.getRotatedWidth(), lastI420Frame.getRotatedHeight(), 0, 0, viewportWidth, viewportHeight); - } - catch (NullPointerException exception){ + } catch (NullPointerException exception) { Log.e(TAG, "skip empty buffer!"); } } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/FaceCapture.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/FaceCapture.java new file mode 100644 index 000000000..efb55f1ac --- /dev/null +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/FaceCapture.java @@ -0,0 +1,419 @@ +package io.agora.api.example.examples.advanced; + +import static io.agora.api.example.common.model.Examples.ADVANCED; +import static io.agora.rtc2.Constants.MediaSourceType.PRIMARY_CAMERA_SOURCE; +import static io.agora.rtc2.video.VideoCanvas.RENDER_MODE_HIDDEN; +import static io.agora.rtc2.video.VideoEncoderConfiguration.STANDARD_BITRATE; + +import android.content.Context; +import android.os.Bundle; +import android.util.Log; +import android.view.LayoutInflater; +import android.view.SurfaceView; +import android.view.View; +import android.view.ViewGroup; +import android.widget.Button; +import 
android.widget.EditText; +import android.widget.FrameLayout; + +import androidx.annotation.NonNull; +import androidx.annotation.Nullable; + +import java.util.Locale; + +import io.agora.api.example.MainApplication; +import io.agora.api.example.R; +import io.agora.api.example.annotation.Example; +import io.agora.api.example.common.BaseFragment; +import io.agora.api.example.utils.CommonUtil; +import io.agora.api.example.utils.TokenUtils; +import io.agora.base.VideoFrame; +import io.agora.rtc2.ChannelMediaOptions; +import io.agora.rtc2.Constants; +import io.agora.rtc2.IMediaExtensionObserver; +import io.agora.rtc2.IRtcEngineEventHandler; +import io.agora.rtc2.RtcEngine; +import io.agora.rtc2.RtcEngineConfig; +import io.agora.rtc2.proxy.LocalAccessPointConfiguration; +import io.agora.rtc2.video.IVideoFrameObserver; +import io.agora.rtc2.video.VideoCanvas; +import io.agora.rtc2.video.VideoEncoderConfiguration; + +/** + * The type Face capture. + */ +@Example( + index = 12, + group = ADVANCED, + name = R.string.item_face_capture, + actionId = R.id.action_mainFragment_to_face_capture, + tipsId = R.string.face_capture_tip +) +public class FaceCapture extends BaseFragment implements View.OnClickListener { + private static final String TAG = FaceCapture.class.getSimpleName(); + + private static final String AUTHENTICATION = ""; + + private FrameLayout fl_local; + private Button join; + private EditText et_channel, et_capture_info; + private RtcEngine engine; + private int myUid; + private boolean joined = false; + + @Override + public void onCreate(@Nullable Bundle savedInstanceState) { + super.onCreate(savedInstanceState); + // Check if the context is valid + Context context = getContext(); + if (context == null) { + return; + } + try { + RtcEngineConfig config = new RtcEngineConfig(); + /* + * The context of Android Activity + */ + config.mContext = context.getApplicationContext(); + /* + * The App ID issued to you by Agora.
See How to get the App ID + */ + config.mAppId = getString(R.string.agora_app_id); + /* Sets the channel profile of the Agora RtcEngine. + CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. + Use this profile in one-on-one calls or group calls, where all users can talk freely. + CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast + channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; + an audience can only receive streams.*/ + config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; + /* + * IRtcEngineEventHandler is an abstract class providing default implementation. + * The SDK uses this class to report to the app on SDK runtime events. + */ + config.mEventHandler = iRtcEngineEventHandler; + config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); + config.mAreaCode = ((MainApplication) requireActivity().getApplication()).getGlobalSettings().getAreaCode(); + /* For Android, the agora_face_capture_extension will not load default. You must add it manually. */ + config.addExtension("agora_face_capture_extension"); + /* Config extension observer to receive the events. */ + config.mExtensionObserver = iMediaExtensionObserver; + engine = RtcEngine.create(config); + /* + * This parameter is for reporting the usages of APIExample to agora background. + * Generally, it is not necessary for you to set this parameter. 
+ */ + engine.setParameters("{" + + "\"rtc.report_app_scenario\":" + + "{" + + "\"appScenario\":" + 100 + "," + + "\"serviceType\":" + 11 + "," + + "\"appVersion\":\"" + RtcEngine.getSdkVersion() + "\"" + + "}" + + "}"); + /* setting the local access point if the private cloud ip was set, otherwise the config will be invalid.*/ + LocalAccessPointConfiguration localAccessPointConfiguration = ((MainApplication) requireActivity().getApplication()).getGlobalSettings().getPrivateCloudConfig(); + if (localAccessPointConfiguration != null) { + // This api can only be used in the private media server scenario, otherwise some problems may occur. + engine.setLocalAccessPoint(localAccessPointConfiguration); + } + + engine.registerVideoFrameObserver(iVideoFrameObserver); + engine.enableExtension("agora_video_filters_face_capture", "face_capture", true, PRIMARY_CAMERA_SOURCE); + engine.setExtensionProperty("agora_video_filters_face_capture", + "face_capture", + "authentication_information", + "{\"company_id\":\"agoraTest\"," + + "\"license\":\"" + + AUTHENTICATION + + "\"}" + ); + } catch (Exception e) { + e.printStackTrace(); + requireActivity().onBackPressed(); + } + } + + @Nullable + @Override + public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) { + return inflater.inflate(R.layout.fragment_face_capture, container, false); + } + + @Override + public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) { + super.onViewCreated(view, savedInstanceState); + join = view.findViewById(R.id.btn_join); + et_channel = view.findViewById(R.id.et_channel); + join.setOnClickListener(this); + fl_local = view.findViewById(R.id.fl_local); + et_capture_info = view.findViewById(R.id.et_capture_info); + } + + + @Override + public void onDestroy() { + /*leaveChannel and Destroy the RtcEngine instance*/ + if (engine != null) { + engine.leaveChannel(); + engine.stopPreview(); + } + engine = null; + 
super.onDestroy(); + handler.post(RtcEngine::destroy); + } + + @Override + public void onClick(View v) { + if (v.getId() == R.id.btn_join) { + if (!joined) { + CommonUtil.hideInputBoard(requireActivity(), et_channel); + // call when join button hit + String channelId = et_channel.getText().toString(); + joinChannel(channelId); + } else { + joined = false; + /*After joining a channel, the user must call the leaveChannel method to end the + * call before joining another channel. This method returns 0 if the user leaves the + * channel and releases all resources related to the call. This method call is + * asynchronous, and the user has not exited the channel when the method call returns. + * Once the user leaves the channel, the SDK triggers the onLeaveChannel callback. + * A successful leaveChannel method call triggers the following callbacks: + * 1:The local client: onLeaveChannel. + * 2:The remote client: onUserOffline, if the user leaving the channel is in the + * Communication channel, or is a BROADCASTER in the Live Broadcast profile. + * @returns 0: Success. + * < 0: Failure. + * PS: + * 1:If you call the destroy method immediately after calling the leaveChannel + * method, the leaveChannel process interrupts, and the SDK does not trigger + * the onLeaveChannel callback. 
+ * 2:If you call the leaveChannel method during CDN live streaming, the SDK + * triggers the removeInjectStreamUrl method.*/ + engine.leaveChannel(); + engine.stopPreview(); + join.setText(getString(R.string.join)); + } + } + } + + private void joinChannel(String channelId) { + // Check if the context is valid + Context context = getContext(); + if (context == null) { + return; + } + + // Create render view by RtcEngine + SurfaceView surfaceView = new SurfaceView(context); + // Add to the local container + fl_local.addView(surfaceView, new FrameLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT)); + // Setup local video to render your local camera preview + engine.setupLocalVideo(new VideoCanvas(surfaceView, RENDER_MODE_HIDDEN, 0)); + + /*In the demo, the default is to enter as the anchor.*/ + engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); + // Setup video encoding configs + engine.setVideoEncoderConfiguration(new VideoEncoderConfiguration( + ((MainApplication) requireActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), + VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) requireActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), + STANDARD_BITRATE, + VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication) requireActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) + )); + /*Route audio to the speakerphone by default.*/ + engine.setDefaultAudioRoutetoSpeakerphone(true); + + engine.enableVideo(); + + engine.startPreview(); + + /*Please configure accessToken in the string_config file. + * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see + * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token + * A token generated at the server. This applies to scenarios with high-security requirements. 
For details, see + * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ + TokenUtils.gen(requireContext(), channelId, 0, token -> { + /* Allows a user to join a channel. + if you do not specify the uid, we will generate the uid for you*/ + + ChannelMediaOptions option = new ChannelMediaOptions(); + option.autoSubscribeAudio = true; + option.autoSubscribeVideo = true; + option.publishCameraTrack = true; + int res = engine.joinChannel(token, channelId, 0, option); + if (res != 0) { + // Usually happens with invalid parameters + // Error code description can be found at: + // en: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html + // cn: https://docs.agora.io/cn/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html + showAlert(RtcEngine.getErrorDescription(Math.abs(res))); + return; + } + // Prevent repeated entry + join.setEnabled(false); + }); + } + + private final IVideoFrameObserver iVideoFrameObserver = new IVideoFrameObserver() { + @Override + public boolean onRenderVideoFrame(String channelId, int uid, VideoFrame videoFrame) { + return false; + } + + @Override + public boolean onCaptureVideoFrame(int sourceType, VideoFrame videoFrame) { + Log.i(TAG, String.format(Locale.US, "VideoFrameObserver >> onCaptureVideoFrame : metadata=%s", + videoFrame.getMetaInfo().toString())); + runOnUIThread(() -> et_capture_info.setText(videoFrame.getMetaInfo().toString())); + return true; + } + + @Override + public boolean onPreEncodeVideoFrame(int sourceType, VideoFrame videoFrame) { + return false; + } + + @Override + public boolean onMediaPlayerVideoFrame(VideoFrame videoFrame, int mediaPlayerId) { + return false; + } + + @Override + public int getVideoFrameProcessMode() { + return PROCESS_MODE_READ_ONLY; + } + + @Override + public int getVideoFormatPreference() { + return VIDEO_PIXEL_DEFAULT; + } + + @Override + public boolean 
getRotationApplied() { + return false; + } + + @Override + public boolean getMirrorApplied() { + return false; + } + + @Override + public int getObservedFramePosition() { + return POSITION_POST_CAPTURER; + } + }; + + private final IMediaExtensionObserver iMediaExtensionObserver = new IMediaExtensionObserver() { + @Override + public void onEvent(String provider, String extension, String key, String value) { + Log.i(TAG, String.format(Locale.US, "ExtensionObserver >> onEvent : provider=%s, extension=%s, key=%s, value=%s", + provider, extension, key, value)); + if ("agora_video_filters_face_capture".equals(provider) + && "face_capture".equals(extension) + && "authentication_state".equals(key)) { + if ("0".equals(value)) { + showShortToast("Face capture authentication successful."); + } else if ("-1".equals(value)) { + showShortToast("Face capture authentication failed!"); + } else if ("-2".equals(value)) { + showShortToast("Face capture authentication information not set!"); + showAlert(getString(R.string.face_capture_authentication), false); + } + } + } + + @Override + public void onStarted(String provider, String extension) { + Log.i(TAG, String.format(Locale.US, "ExtensionObserver >> onStarted : provider=%s, extension=%s", + provider, extension)); + } + + @Override + public void onStopped(String provider, String extension) { + Log.i(TAG, String.format(Locale.US, "ExtensionObserver >> onStopped : provider=%s, extension=%s", + provider, extension)); + } + + @Override + public void onError(String provider, String extension, int error, String message) { + Log.i(TAG, String.format(Locale.US, "ExtensionObserver >> onError : provider=%s, extension=%s, error=%d, message=%s", + provider, extension, error, message)); + } + }; + + /** + * IRtcEngineEventHandler is an abstract class providing default implementation. + * The SDK uses this class to report to the app on SDK runtime events. 
+ */ + private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() { + /** + * Error code description can be found at: + * en: ... + * cn: ... + */ + @Override + public void onError(int err) { + Log.w(TAG, String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err))); + } + + + /**Occurs when a user leaves the channel. + * @param stats With this callback, the application retrieves the channel information, + * such as the call duration and statistics.*/ + @Override + public void onLeaveChannel(RtcStats stats) { + super.onLeaveChannel(stats); + Log.i(TAG, String.format("local user %d leaveChannel!", myUid)); + showLongToast(String.format(Locale.US, "local user %d leaveChannel!", myUid)); + } + + /**Occurs when the local user joins a specified channel. + * The channel name assignment is based on channelName specified in the joinChannel method. + * If the uid is not specified when joinChannel is called, the server automatically assigns a uid. + * @param channel Channel name + * @param uid User ID + * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/ + @Override + public void onJoinChannelSuccess(String channel, int uid, int elapsed) { + Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); + showLongToast(String.format(Locale.US, "onJoinChannelSuccess channel %s uid %d", channel, uid)); + myUid = uid; + joined = true; + handler.post(new Runnable() { + @Override + public void run() { + join.setEnabled(true); + join.setText(getString(R.string.leave)); + } + }); + } + + /**Occurs when a remote user (Communication)/host (Live Broadcast) joins the channel. + * @param uid ID of the remote user or host who joins the channel. 
+ * @param elapsed Time delay (ms) from the local user calling joinChannel/setClientRole + * until this callback is triggered.*/ + @Override + public void onUserJoined(int uid, int elapsed) { + super.onUserJoined(uid, elapsed); + Log.i(TAG, "onUserJoined->" + uid); + showLongToast(String.format(Locale.US, "user %d joined!", uid)); + } + + /**Occurs when a remote user (Communication)/host (Live Broadcast) leaves the channel. + * @param uid ID of the user who goes offline. + * @param reason Reason why the user goes offline: + * USER_OFFLINE_QUIT(0): The user left the current channel. + * USER_OFFLINE_DROPPED(1): The SDK timed out and the user dropped offline because no data + * packet was received within a certain period of time. If a user quits the + * call and the message is not passed to the SDK (due to an unreliable channel), + * the SDK assumes the user dropped offline. + * USER_OFFLINE_BECOME_AUDIENCE(2): (Live broadcast only.) The client role switched from + * the host to the audience.*/ + @Override + public void onUserOffline(int uid, int reason) { + Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason)); + showLongToast(String.format(Locale.US, "user %d offline! 
reason:%d", uid, reason)); + } + }; + +} diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/HostAcrossChannel.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/HostAcrossChannel.java index e5910f624..06d6fa0d0 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/HostAcrossChannel.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/HostAcrossChannel.java @@ -39,7 +39,9 @@ import io.agora.rtc2.video.VideoCanvas; import io.agora.rtc2.video.VideoEncoderConfiguration; -/**This demo demonstrates how to make a one-to-one video call*/ +/** + * This demo demonstrates how to make a one-to-one video call + */ @Example( index = 21, group = ADVANCED, @@ -47,8 +49,7 @@ actionId = R.id.action_mainFragment_to_hostacrosschannel, tipsId = R.string.hostacrosschannel ) -public class HostAcrossChannel extends BaseFragment implements View.OnClickListener -{ +public class HostAcrossChannel extends BaseFragment implements View.OnClickListener { private static final String TAG = HostAcrossChannel.class.getSimpleName(); private FrameLayout fl_local, fl_remote; @@ -62,15 +63,13 @@ public class HostAcrossChannel extends BaseFragment implements View.OnClickListe @Nullable @Override - public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) - { + public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) { View view = inflater.inflate(R.layout.fragment_host_across_channel, container, false); return view; } @Override - public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) - { + public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) { super.onViewCreated(view, savedInstanceState); join = view.findViewById(R.id.btn_join); join_ex = 
view.findViewById(R.id.btn_join_ex); @@ -88,42 +87,39 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat } @Override - public void onActivityCreated(@Nullable Bundle savedInstanceState) - { + public void onActivityCreated(@Nullable Bundle savedInstanceState) { super.onActivityCreated(savedInstanceState); // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } - try - { + try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. 
* Generally, it is not necessary for you to set this parameter. */ @@ -141,21 +137,17 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) // This api can only be used in the private media server scenario, otherwise some problems may occur. engine.setLocalAccessPoint(localAccessPointConfiguration); } - } - catch (Exception e) - { + } catch (Exception e) { e.printStackTrace(); getActivity().onBackPressed(); } } @Override - public void onDestroy() - { + public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ - if(engine != null) - { + /*leaveChannel and Destroy the RtcEngine instance*/ + if (engine != null) { engine.leaveChannel(); engine.stopChannelMediaRelay(); mediaRelaying = false; @@ -165,18 +157,14 @@ public void onDestroy() } @Override - public void onClick(View v) - { - if (v.getId() == R.id.btn_join) - { - if (!joined) - { + public void onClick(View v) { + if (v.getId() == R.id.btn_join) { + if (!joined) { CommonUtil.hideInputBoard(getActivity(), et_channel); // call when join button hit String channelId = et_channel.getText().toString(); // Check permission - if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) - { + if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) { joinChannel(channelId); return; } @@ -185,16 +173,13 @@ public void onClick(View v) Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); - } - else - { + } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. 
This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. @@ -215,11 +200,10 @@ public void onClick(View v) join.setText(getString(R.string.join)); join_ex.setText(getString(R.string.join)); } - } - else if(v.getId() == R.id.btn_join_ex){ - if(!mediaRelaying){ + } else if (v.getId() == R.id.btn_join_ex) { + if (!mediaRelaying) { String destChannelName = et_channel_ex.getText().toString(); - if(destChannelName.length() == 0){ + if (destChannelName.length() == 0) { showAlert("Destination channel name is empty!"); } @@ -228,26 +212,23 @@ else if(v.getId() == R.id.btn_join_ex){ mediaRelayConfiguration.setSrcChannelInfo(srcChannelInfo); ChannelMediaInfo destChannelInfo = new ChannelMediaInfo(destChannelName, null, myUid); mediaRelayConfiguration.setDestChannelInfo(destChannelName, destChannelInfo); - engine.startChannelMediaRelay(mediaRelayConfiguration); + engine.startOrUpdateChannelMediaRelay(mediaRelayConfiguration); et_channel_ex.setEnabled(false); join_ex.setEnabled(false); pause.setEnabled(true); - } - else{ + } else { engine.stopChannelMediaRelay(); et_channel_ex.setEnabled(true); pause.setEnabled(false); join_ex.setText(getString(R.string.join)); mediaRelaying = false; } - } - else if(v.getId() == R.id.btn_pause){ - if(!isPaused){ + } else if (v.getId() == R.id.btn_pause) { + if (!isPaused) { engine.pauseAllChannelMediaRelay(); isPaused = true; pause.setText(R.string.resume); - } - else{ + } else { engine.resumeAllChannelMediaRelay(); isPaused = false; pause.setText(R.string.pause); @@ -255,19 +236,16 @@ else if(v.getId() == R.id.btn_pause){ } } - private void joinChannel(String channelId) - { + private void joinChannel(String channelId) { // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } // Create render view by RtcEngine SurfaceView 
surfaceView = new SurfaceView(context); - if(fl_local.getChildCount() > 0) - { + if (fl_local.getChildCount() > 0) { fl_local.removeAllViews(); } // Add to the local container @@ -277,29 +255,28 @@ private void joinChannel(String channelId) // Route audio to the speakerphone by default engine.setDefaultAudioRoutetoSpeakerphone(true); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enable video module engine.enableVideo(); // Setup video encoding configs engine.setVideoEncoderConfiguration(new VideoEncoderConfiguration( - ((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), - VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), + ((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), + VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), STANDARD_BITRATE, - VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) + VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) )); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. 
For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, ret -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ int res = engine.joinChannel(ret, channelId, "Extra Optional Data", 0); - if (res != 0) - { + if (res != 0) { // Usually happens with invalid parameters // Error code description can be found at: // en: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html @@ -318,16 +295,14 @@ private void joinChannel(String channelId) * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ - private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() - { + private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() { /** * Error code description can be found at: * en: https://api-ref.agora.io/en/video-sdk/android/4.x/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror * cn: https://docs.agora.io/cn/video-call-4.x/API%20Reference/java_ng/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror */ @Override - public void onError(int err) - { + public void onError(int err) { Log.w(TAG, String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err))); } @@ -336,8 +311,7 @@ public void onError(int err) * @param stats With this callback, the application retrieves the channel information, * such as the call duration and statistics.*/ @Override - public void onLeaveChannel(RtcStats stats) - { + public void onLeaveChannel(RtcStats stats) { super.onLeaveChannel(stats); Log.i(TAG, String.format("local user %d leaveChannel!", myUid)); showLongToast(String.format("local user %d leaveChannel!", myUid)); @@ 
-350,17 +324,14 @@ public void onLeaveChannel(RtcStats stats) * @param uid User ID * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/ @Override - public void onJoinChannelSuccess(String channel, int uid, int elapsed) - { + public void onJoinChannelSuccess(String channel, int uid, int elapsed) { Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); showLongToast(String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); myUid = uid; joined = true; - handler.post(new Runnable() - { + handler.post(new Runnable() { @Override - public void run() - { + public void run() { join.setEnabled(true); join.setText(getString(R.string.leave)); join_ex.setEnabled(true); @@ -445,8 +416,7 @@ public void onRemoteAudioStateChanged(int uid, int state, int reason, int elapse * @param elapsed Time elapsed (ms) from the local user calling the joinChannel method until * the SDK triggers this callback.*/ @Override - public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) - { + public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) { super.onRemoteVideoStateChanged(uid, state, reason, elapsed); Log.i(TAG, "onRemoteVideoStateChanged->" + uid + ", state->" + state + ", reason->" + reason); } @@ -456,22 +426,19 @@ public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapse * @param elapsed Time delay (ms) from the local user calling joinChannel/setClientRole * until this callback is triggered.*/ @Override - public void onUserJoined(int uid, int elapsed) - { + public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; } - handler.post(() -> - { - /**Display remote video stream*/ 
+ handler.post(() -> { + /*Display remote video stream*/ SurfaceView surfaceView = null; - if (fl_remote.getChildCount() > 0) - { + if (fl_remote.getChildCount() > 0) { fl_remote.removeAllViews(); } // Create render view by RtcEngine @@ -496,14 +463,13 @@ public void onUserJoined(int uid, int elapsed) * USER_OFFLINE_BECOME_AUDIENCE(2): (Live broadcast only.) The client role switched from * the host to the audience.*/ @Override - public void onUserOffline(int uid, int reason) - { + public void onUserOffline(int uid, int reason) { Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason)); showLongToast(String.format("user %d offline! reason:%d", uid, reason)); handler.post(new Runnable() { @Override public void run() { - /**Clear render view + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); @@ -538,21 +504,23 @@ public void run() { */ @Override public void onChannelMediaRelayStateChanged(int state, int code) { - switch (state){ + switch (state) { case RELAY_STATE_CONNECTING: mediaRelaying = true; - handler.post(() ->{ - et_channel_ex.setEnabled(false); - join_ex.setEnabled(true); - join_ex.setText(getText(R.string.stop)); - showLongToast("channel media Relay connected."); + handler.post(() -> { + et_channel_ex.setEnabled(false); + join_ex.setEnabled(true); + join_ex.setText(getText(R.string.stop)); + showLongToast("channel media Relay connected."); }); break; case RELAY_STATE_FAILURE: mediaRelaying = false; - handler.post(() ->{ + handler.post(() -> { showLongToast(String.format("channel media Relay failed at error code: %d", code)); }); + default: + break; } } }; diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/InCallReport.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/InCallReport.java index 
9bae5fb11..90c53c9ba 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/InCallReport.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/InCallReport.java @@ -44,9 +44,11 @@ //) /** + * The type In call report. * - * @deprecated The report has been moved to {@link io.agora.api.example.common.widget.VideoReportLayout}. - * You can refer to {@link LiveStreaming} or {@link io.agora.api.example.examples.basic.JoinChannelVideo} example. + * @deprecated The report has been moved to + * {@link io.agora.api.example.common.widget.VideoReportLayout}. + * You can refer to {@link LiveStreaming} or {@link io.agora.api.example.examples.basic.JoinChannelVideo} example. */ public class InCallReport extends BaseFragment implements View.OnClickListener { private static final String TAG = InCallReport.class.getSimpleName(); @@ -62,15 +64,13 @@ public class InCallReport extends BaseFragment implements View.OnClickListener { @Nullable @Override - public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) - { + public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) { View view = inflater.inflate(R.layout.fragment_in_call_report, container, false); return view; } @Override - public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) - { + public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) { super.onViewCreated(view, savedInstanceState); join = view.findViewById(R.id.btn_join); statisticsInfo = new StatisticsInfo(); @@ -84,16 +84,16 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat fl_remote = view.findViewById(R.id.fl_remote); } - private void updateLocalStats(){ + private void updateLocalStats() { handler.post(new Runnable() { - @Override - public void run() { - 
localStats.setText(statisticsInfo.getLocalVideoStats()); - } - }); + @Override + public void run() { + localStats.setText(statisticsInfo.getLocalVideoStats()); + } + }); } - private void updateRemoteStats(){ + private void updateRemoteStats() { handler.post(new Runnable() { @Override public void run() { @@ -103,42 +103,39 @@ public void run() { } @Override - public void onActivityCreated(@Nullable Bundle savedInstanceState) - { + public void onActivityCreated(@Nullable Bundle savedInstanceState) { super.onActivityCreated(savedInstanceState); // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } - try - { + try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. 
*/ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -157,20 +154,17 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) engine.setLocalAccessPoint(localAccessPointConfiguration); } } - catch (Exception e) - { + catch (Exception e) { e.printStackTrace(); getActivity().onBackPressed(); } } @Override - public void onDestroy() - { + public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ - if(engine != null) - { + /*leaveChannel and Destroy the RtcEngine instance*/ + if (engine != null) { engine.leaveChannel(); engine.stopPreview(); } @@ -179,18 +173,14 @@ public void onDestroy() } @Override - public void onClick(View v) - { - if (v.getId() == R.id.btn_join) - { - if (!joined) - { + public void onClick(View v) { + if (v.getId() == R.id.btn_join) { + if (!joined) { CommonUtil.hideInputBoard(getActivity(), et_channel); // call when join button hit String channelId = et_channel.getText().toString(); // Check permission - if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) - { + if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) { joinChannel(channelId); return; } @@ -199,16 +189,13 @@ public void onClick(View v) Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted 
joinChannel(channelId); }).start(); - } - else - { + } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. @@ -231,19 +218,16 @@ public void onClick(View v) } } - private void joinChannel(String channelId) - { + private void joinChannel(String channelId) { // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } // Create render view by RtcEngine SurfaceView surfaceView = new SurfaceView(context); - if(fl_local.getChildCount() > 0) - { + if (fl_local.getChildCount() > 0) { fl_local.removeAllViews(); } // Add to the local container @@ -253,7 +237,7 @@ private void joinChannel(String channelId) // Route audio to the speakerphone by default engine.setDefaultAudioRoutetoSpeakerphone(true); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enable video module engine.enableVideo(); @@ -261,23 +245,22 @@ private void joinChannel(String channelId) engine.startPreview(); // Setup video encoding configs engine.setVideoEncoderConfiguration(new VideoEncoderConfiguration( - ((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), - VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), + ((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), + VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) 
getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), STANDARD_BITRATE, - VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) + VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) )); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, ret -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ int res = engine.joinChannel(ret, channelId, "Extra Optional Data", 0); - if (res != 0) - { + if (res != 0) { // Usually happens with invalid parameters // Error code description can be found at: // en: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html @@ -294,16 +277,14 @@ private void joinChannel(String channelId) * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. 
*/ - private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() - { + private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() { /** * Error code description can be found at: * en: https://api-ref.agora.io/en/video-sdk/android/4.x/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror * cn: https://docs.agora.io/cn/video-call-4.x/API%20Reference/java_ng/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror */ @Override - public void onError(int err) - { + public void onError(int err) { Log.w(TAG, String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err))); } @@ -312,8 +293,7 @@ public void onError(int err) * @param stats With this callback, the application retrieves the channel information, * such as the call duration and statistics.*/ @Override - public void onLeaveChannel(RtcStats stats) - { + public void onLeaveChannel(RtcStats stats) { super.onLeaveChannel(stats); Log.i(TAG, String.format("local user %d leaveChannel!", myUid)); showLongToast(String.format("local user %d leaveChannel!", myUid)); @@ -326,17 +306,14 @@ public void onLeaveChannel(RtcStats stats) * @param uid User ID * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/ @Override - public void onJoinChannelSuccess(String channel, int uid, int elapsed) - { + public void onJoinChannelSuccess(String channel, int uid, int elapsed) { Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); showLongToast(String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); myUid = uid; joined = true; - handler.post(new Runnable() - { + handler.post(new Runnable() { @Override - public void run() - { + public void run() { join.setEnabled(true); join.setText(getString(R.string.leave)); } @@ -419,8 +396,7 @@ public void onRemoteAudioStateChanged(int uid, int state, int reason, int elapse * @param 
elapsed Time elapsed (ms) from the local user calling the joinChannel method until * the SDK triggers this callback.*/ @Override - public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) - { + public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) { super.onRemoteVideoStateChanged(uid, state, reason, elapsed); Log.i(TAG, "onRemoteVideoStateChanged->" + uid + ", state->" + state + ", reason->" + reason); } @@ -430,22 +406,19 @@ public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapse * @param elapsed Time delay (ms) from the local user calling joinChannel/setClientRole * until this callback is triggered.*/ @Override - public void onUserJoined(int uid, int elapsed) - { + public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; } - handler.post(() -> - { - /**Display remote video stream*/ + handler.post(() -> { + /*Display remote video stream*/ SurfaceView surfaceView = null; - if (fl_remote.getChildCount() > 0) - { + if (fl_remote.getChildCount() > 0) { fl_remote.removeAllViews(); } // Create render view by RtcEngine @@ -470,14 +443,13 @@ public void onUserJoined(int uid, int elapsed) * USER_OFFLINE_BECOME_AUDIENCE(2): (Live broadcast only.) The client role switched from * the host to the audience.*/ @Override - public void onUserOffline(int uid, int reason) - { + public void onUserOffline(int uid, int reason) { Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason)); showLongToast(String.format("user %d offline! 
reason:%d", uid, reason)); handler.post(new Runnable() { @Override public void run() { - /**Clear render view + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/JoinMultipleChannel.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/JoinMultipleChannel.java index 2ba5466a6..aa49c39d9 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/JoinMultipleChannel.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/JoinMultipleChannel.java @@ -47,6 +47,9 @@ import io.agora.rtc2.video.VideoCanvas; import io.agora.rtc2.video.VideoEncoderConfiguration; +/** + * The type Join multiple channel. + */ @Example( index = 13, group = ADVANCED, @@ -97,22 +100,22 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. 
A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ @@ -120,7 +123,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = (RtcEngineEx) RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -147,7 +150,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { @Override public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ + /*leaveChannel and Destroy the RtcEngine instance*/ if (engine != null) { engine.leaveChannel(); engine.stopPreview(); @@ -174,14 +177,13 @@ public void onClick(View v) { Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channel1); }).start(); } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. 
@@ -229,7 +231,7 @@ public void onClick(View v) { private void takeSnapshotEx() { int remoteUid = fl_remote2.getReportUid(); - if( remoteUid <= 0 || !joinedEx){ + if (remoteUid <= 0 || !joinedEx) { showLongToast(getString(R.string.remote_screenshot_tip)); return; } @@ -259,7 +261,7 @@ private void joinChannel(String channelId) { // Set audio route to microPhone engine.setDefaultAudioRoutetoSpeakerphone(true); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enable video module engine.enableVideo(); @@ -273,13 +275,13 @@ private void joinChannel(String channelId) { VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) )); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, ret -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. 
if you do not specify the uid, we will generate the uid for you*/ ChannelMediaOptions option = new ChannelMediaOptions(); @@ -365,7 +367,7 @@ public void onLocalAudioStateChanged(int state, int error) { super.onLocalAudioStateChanged(state, error); Log.i(TAG, String.format("onLocalAudioStateChanged state:%d!", state)); showLongToast(String.format(Locale.US, "onLocalAudioStateChanged state:%d!", state)); - if(state == Constants.LOCAL_AUDIO_STREAM_STATE_STOPPED){ + if (state == Constants.LOCAL_AUDIO_STREAM_STATE_STOPPED) { runOnUIThread(() -> fl_local.setLocalAudioStats(new LocalAudioStats())); showAlert(getString(R.string.microphone_stop_tip)); } @@ -387,7 +389,7 @@ public void onLocalVideoStats(Constants.VideoSourceType source, LocalVideoStats public void onRemoteAudioStats(RemoteAudioStats stats) { super.onRemoteAudioStats(stats); runOnUIThread(() -> { - if(fl_remote2.getReportUid() == stats.uid){ + if (fl_remote2.getReportUid() == stats.uid) { fl_remote2.setRemoteAudioStats(stats); } }); @@ -397,7 +399,7 @@ public void onRemoteAudioStats(RemoteAudioStats stats) { public void onRemoteVideoStats(RemoteVideoStats stats) { super.onRemoteVideoStats(stats); runOnUIThread(() -> { - if(fl_remote2.getReportUid() == stats.uid){ + if (fl_remote2.getReportUid() == stats.uid) { fl_remote2.setRemoteVideoStats(stats); } }); @@ -414,14 +416,13 @@ public void onRemoteVideoStats(RemoteVideoStats stats) { public void onUserJoined(int uid, int elapsed) { Log.i(TAG, "channel2 onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; } - runOnUIThread(() -> - { - /**Display remote video stream*/ + runOnUIThread(() -> { + /*Display remote video stream*/ SurfaceView surfaceView = null; if (fl_remote2.getChildCount() > 0) { fl_remote2.removeAllViews(); @@ -606,7 +607,7 @@ public void onRemoteVideoStateChanged(int uid, int state, 
int reason, int elapse public void onRemoteAudioStats(RemoteAudioStats stats) { super.onRemoteAudioStats(stats); runOnUIThread(() -> { - if(fl_remote.getReportUid() == stats.uid){ + if (fl_remote.getReportUid() == stats.uid) { fl_remote.setRemoteAudioStats(stats); } }); @@ -616,7 +617,7 @@ public void onRemoteAudioStats(RemoteAudioStats stats) { public void onRemoteVideoStats(RemoteVideoStats stats) { super.onRemoteVideoStats(stats); runOnUIThread(() -> { - if(fl_remote.getReportUid() == stats.uid){ + if (fl_remote.getReportUid() == stats.uid) { fl_remote.setRemoteVideoStats(stats); } }); @@ -634,14 +635,13 @@ public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; } - handler.post(() -> - { - /**Display remote video stream*/ + handler.post(() -> { + /*Display remote video stream*/ SurfaceView surfaceView = null; if (fl_remote.getChildCount() > 0) { fl_remote.removeAllViews(); @@ -678,7 +678,7 @@ public void onUserOffline(int uid, int reason) { handler.post(new Runnable() { @Override public void run() { - /**Clear render view + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/KtvCopyrightMusic.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/KtvCopyrightMusic.java index e65e06db0..a5316ae4c 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/KtvCopyrightMusic.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/KtvCopyrightMusic.java @@ -2,10 +2,15 @@ 
import static io.agora.api.example.common.model.Examples.ADVANCED; +import java.util.Locale; + import io.agora.api.example.R; import io.agora.api.example.annotation.Example; import io.agora.api.example.common.BaseBrowserFragment; +/** + * The KTV copyright music example. + */ @Example( index = 24, group = ADVANCED, @@ -17,7 +22,10 @@ public class KtvCopyrightMusic extends BaseBrowserFragment { @Override protected String getBrowserUrl() { - return "https://docs.agora.io/cn/online-ktv/downloads?platform=All%20Platforms&from_wecom=1"; + if (Locale.CHINESE.getLanguage().equals(getResources().getConfiguration().locale.getLanguage())) { + return "https://doc.shengwang.cn/doc/online-ktv/android/landing-page"; + } + return "https://docs.agora.io/en/interactive-live-streaming/overview/product-overview?platform=android"; } } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/LiveStreaming.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/LiveStreaming.java index b00112308..91a456ded 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/LiveStreaming.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/LiveStreaming.java @@ -147,11 +147,9 @@ public void onNothingSelected(AdapterView parent) { public void onItemSelected(AdapterView parent, View view, int position, long id) { if (getString(R.string.render_mode_hidden).equals(parent.getSelectedItem())) { canvasRenderMode = Constants.RENDER_MODE_HIDDEN; - } - else if(getString(R.string.render_mode_fit).equals(parent.getSelectedItem())){ + } else if (getString(R.string.render_mode_fit).equals(parent.getSelectedItem())) { canvasRenderMode = Constants.RENDER_MODE_FIT; - } - else if(getString(R.string.render_mode_adaptive).equals(parent.getSelectedItem())){ + } else if (getString(R.string.render_mode_adaptive).equals(parent.getSelectedItem())) { canvasRenderMode = Constants.RENDER_MODE_ADAPTIVE; }
updateVideoView(); @@ -181,7 +179,7 @@ public void onStopTrackingTouch(SeekBar seekBar) { } }); mSettingBinding.switchVideoImage.setOnCheckedChangeListener((buttonView, isChecked) -> { - if(!isHost && isChecked){ + if (!isHost && isChecked) { showShortToast("Please join channel with broadcast role firstly."); buttonView.setChecked(false); return; @@ -200,20 +198,20 @@ public void onStopTrackingTouch(SeekBar seekBar) { private void updateVideoView() { - if(backGroundVideo.getChildCount() > 0 && backGroundVideo.getReportUid() != -1){ + if (backGroundVideo.getChildCount() > 0 && backGroundVideo.getReportUid() != -1) { int reportUid = backGroundVideo.getReportUid(); SurfaceView videoView = new SurfaceView(requireContext()); backGroundVideo.removeAllViews(); backGroundVideo.addView(videoView); VideoCanvas local = new VideoCanvas(videoView, canvasRenderMode, reportUid); local.backgroundColor = canvasBgColor; - if(reportUid == myUid){ + if (reportUid == myUid) { engine.setupLocalVideo(local); - }else{ + } else { engine.setupRemoteVideo(local); } } - if(foreGroundVideo.getChildCount() > 0 && foreGroundVideo.getReportUid() != -1){ + if (foreGroundVideo.getChildCount() > 0 && foreGroundVideo.getReportUid() != -1) { int reportUid = foreGroundVideo.getReportUid(); SurfaceView videoView = new SurfaceView(requireContext()); videoView.setZOrderMediaOverlay(true); @@ -221,9 +219,9 @@ private void updateVideoView() { foreGroundVideo.addView(videoView); VideoCanvas local = new VideoCanvas(videoView, canvasRenderMode, reportUid); local.backgroundColor = canvasBgColor; - if(reportUid == myUid){ + if (reportUid == myUid) { engine.setupLocalVideo(local); - }else{ + } else { engine.setupRemoteVideo(local); } } @@ -310,8 +308,7 @@ public void onClick(View v) { Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); @@ -330,7 +327,7 @@ public 
void onClick(View v) { foreGroundVideo.removeAllViews(); backGroundVideo.removeAllViews(); - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. @@ -357,7 +354,8 @@ public void onClick(View v) { engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); } else { ClientRoleOptions clientRoleOptions = new ClientRoleOptions(); - clientRoleOptions.audienceLatencyLevel = mSettingBinding.switchLowLatency.isChecked() ? Constants.AUDIENCE_LATENCY_LEVEL_ULTRA_LOW_LATENCY : Constants.AUDIENCE_LATENCY_LEVEL_LOW_LATENCY; + clientRoleOptions.audienceLatencyLevel = mSettingBinding.switchLowLatency.isChecked() ? Constants.AUDIENCE_LATENCY_LEVEL_ULTRA_LOW_LATENCY + : Constants.AUDIENCE_LATENCY_LEVEL_LOW_LATENCY; engine.setClientRole(CLIENT_ROLE_AUDIENCE, clientRoleOptions); } mRootBinding.btnPublish.setEnabled(false); @@ -379,7 +377,7 @@ public void onClick(View v) { SurfaceView remoteView = new SurfaceView(getContext()); if (isLocalVideoForeground) { // Add to the local container - foreGroundVideo.addView(localView,0, new FrameLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT)); + foreGroundVideo.addView(localView, 0, new FrameLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT)); // Add to the remote container backGroundVideo.addView(remoteView, 0, new FrameLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT)); // Setup remote video to render @@ -416,8 +414,7 @@ public void onClick(View v) { Toast.makeText(getContext(), "The channel name is empty!", Toast.LENGTH_SHORT).show(); } else { myUid = new 
Random().nextInt(1000) + 10000; - TokenUtils.gen(getContext(), channelName, myUid, token -> - { + TokenUtils.gen(getContext(), channelName, myUid, token -> { myToken = token; int ret = engine.preloadChannel(token, channelName, myUid); if (ret == Constants.ERR_OK) { @@ -541,9 +538,9 @@ private void setEncodingPreference(int index) { } private void enableBFrame(boolean enable) { - videoEncoderConfiguration.advanceOptions.compressionPreference = enable ? - VideoEncoderConfiguration.COMPRESSION_PREFERENCE.PREFER_QUALITY : - VideoEncoderConfiguration.COMPRESSION_PREFERENCE.PREFER_LOW_LATENCY; + videoEncoderConfiguration.advanceOptions.compressionPreference = enable + ? VideoEncoderConfiguration.COMPRESSION_PREFERENCE.PREFER_QUALITY + : VideoEncoderConfiguration.COMPRESSION_PREFERENCE.PREFER_LOW_LATENCY; engine.setVideoEncoderConfiguration(videoEncoderConfiguration); } @@ -674,7 +671,7 @@ public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; @@ -684,10 +681,9 @@ public void onUserJoined(int uid, int elapsed) { } else { remoteUid = uid; } - handler.post(() -> - { + handler.post(() -> { VideoReportLayout videoContainer = isLocalVideoForeground ? 
backGroundVideo : foreGroundVideo; - /**Display remote video stream*/ + /*Display remote video stream*/ SurfaceView surfaceView = null; if (videoContainer.getChildCount() > 0) { videoContainer.removeAllViews(); @@ -725,7 +721,7 @@ public void onUserOffline(int uid, int reason) { runOnUIThread(new Runnable() { @Override public void run() { - /**Clear render view + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ VideoCanvas remote = new VideoCanvas(null, canvasRenderMode, uid); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/LocalVideoTranscoding.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/LocalVideoTranscoding.java index 51cece08f..7852f5d6d 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/LocalVideoTranscoding.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/LocalVideoTranscoding.java @@ -46,7 +46,9 @@ import io.agora.rtc2.video.VideoEncoderConfiguration; import io.agora.rtc2.video.VirtualBackgroundSource; -/**This demo demonstrates how to make a one-to-one video call*/ +/** + * This demo demonstrates how to make a one-to-one video call + */ @Example( index = 19, group = ADVANCED, @@ -54,8 +56,7 @@ actionId = R.id.action_mainFragment_to_LocalVideoTranscoding, tipsId = R.string.localvideotranscoding ) -public class LocalVideoTranscoding extends BaseFragment implements View.OnClickListener, CompoundButton.OnCheckedChangeListener -{ +public class LocalVideoTranscoding extends BaseFragment implements View.OnClickListener, CompoundButton.OnCheckedChangeListener { private static final String TAG = LocalVideoTranscoding.class.getSimpleName(); private VideoReportLayout videoReportLayout; @@ -69,14 +70,12 @@ public class LocalVideoTranscoding extends BaseFragment implements View.OnClickL @Nullable @Override - public View 
onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) - { + public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) { return inflater.inflate(R.layout.fragment_localvideotranscoding, container, false); } @Override - public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) - { + public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) { super.onViewCreated(view, savedInstanceState); join = view.findViewById(R.id.btn_join); switchTransparentBackground = view.findViewById(R.id.btn_transparent_background); @@ -87,42 +86,39 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat } @Override - public void onActivityCreated(@Nullable Bundle savedInstanceState) - { + public void onActivityCreated(@Nullable Bundle savedInstanceState) { super.onActivityCreated(savedInstanceState); // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } - try - { + try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. 
A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -141,20 +137,17 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) engine.setLocalAccessPoint(localAccessPointConfiguration); } } - catch (Exception e) - { + catch (Exception e) { e.printStackTrace(); getActivity().onBackPressed(); } } @Override - public void onDestroy() - { + public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ - if(engine != null) - { + /*leaveChannel and Destroy the RtcEngine instance*/ + if (engine != null) { engine.leaveChannel(); engine.stopPreview(Constants.VideoSourceType.VIDEO_SOURCE_TRANSCODED); engine.stopCameraCapture(Constants.VideoSourceType.VIDEO_SOURCE_CAMERA_PRIMARY); @@ -166,22 +159,17 @@ public void onDestroy() @SuppressLint("WrongConstant") @Override - public void onClick(View v) - { - if (v.getId() == R.id.btn_join) - { - if (!joined) - { + public void onClick(View v) { + if (v.getId() == R.id.btn_join) { + if (!joined) { CommonUtil.hideInputBoard(getActivity(), et_channel); // call when join button hit String channelId = et_channel.getText().toString(); // Check permission joinChannel(channelId); - } - else - 
{ + } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. @@ -208,12 +196,10 @@ public void onClick(View v) } } - private void joinChannel(String channelId) - { + private void joinChannel(String channelId) { // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } DisplayMetrics metrics = new DisplayMetrics(); @@ -221,16 +207,16 @@ private void joinChannel(String channelId) int width = 720; int height = (int) (width * 1.0f / metrics.widthPixels * metrics.heightPixels); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enable video module engine.enableVideo(); // Setup video encoding configs engine.setVideoEncoderConfiguration(new VideoEncoderConfiguration( new VideoEncoderConfiguration.VideoDimensions(width, height), - VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), + VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), STANDARD_BITRATE, - VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) + VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) )); // Set audio route to microPhone @@ -276,8 +262,7 @@ private 
void joinChannel(String channelId) // Create render view by RtcEngine SurfaceView surfaceView = new SurfaceView(context); - if(videoReportLayout.getChildCount() > 0) - { + if (videoReportLayout.getChildCount() > 0) { videoReportLayout.removeAllViews(); } // Setup local video to render your local camera preview @@ -296,18 +281,17 @@ private void joinChannel(String channelId) option.publishMicrophoneTrack = true; option.publishTranscodedVideoTrack = true; - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, ret -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ int res = engine.joinChannel(ret, channelId, 0, option); - if (res != 0) - { + if (res != 0) { // Usually happens with invalid parameters // Error code description can be found at: // en: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html @@ -320,16 +304,13 @@ private void joinChannel(String channelId) }); - - } /** * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. 
     */
-   private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler()
-   {
+   private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() {
        @Override
        public void onLocalVideoTranscoderError(LocalTranscoderConfiguration.TranscodingVideoStream stream, int error) {
            super.onLocalVideoTranscoderError(stream, error);
@@ -351,7 +332,8 @@ public void onError(int err) {
            if (Constants.ERR_INVALID_TOKEN == err) {
                showAlert(getString(R.string.token_invalid));
-           } if (Constants.ERR_TOKEN_EXPIRED == err) {
+           }
+           if (Constants.ERR_TOKEN_EXPIRED == err) {
                showAlert(getString(R.string.token_expired));
            }
        }
@@ -361,8 +343,7 @@ public void onError(int err) {
         * @param stats With this callback, the application retrieves the channel information,
         *              such as the call duration and statistics.*/
        @Override
-       public void onLeaveChannel(RtcStats stats)
-       {
+       public void onLeaveChannel(RtcStats stats) {
            super.onLeaveChannel(stats);
            Log.i(TAG, String.format("local user %d leaveChannel!", myUid));
            showLongToast(String.format("local user %d leaveChannel!", myUid));
@@ -375,17 +356,14 @@ public void onLeaveChannel(RtcStats stats)
         * @param uid User ID
         * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/
        @Override
-       public void onJoinChannelSuccess(String channel, int uid, int elapsed)
-       {
+       public void onJoinChannelSuccess(String channel, int uid, int elapsed) {
            Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid));
            showLongToast(String.format("onJoinChannelSuccess channel %s uid %d", channel, uid));
            myUid = uid;
            joined = true;
-           handler.post(new Runnable()
-           {
+           handler.post(new Runnable() {
                @Override
-               public void run()
-               {
+               public void run() {
                    join.setEnabled(true);
                    join.setText(getString(R.string.leave));
                    videoReportLayout.setReportUid(uid);
@@ -469,8 +447,7 @@ public void onRemoteAudioStateChanged(int uid, int state, int reason, int elapse
         * @param elapsed Time elapsed (ms) from the local user calling the joinChannel method until
         *                the SDK triggers this callback.*/
        @Override
-       public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed)
-       {
+       public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) {
            super.onRemoteVideoStateChanged(uid, state, reason, elapsed);
            Log.i(TAG, "onRemoteVideoStateChanged->" + uid + ", state->" + state + ", reason->" + reason);
        }
@@ -492,8 +469,12 @@ public void onLocalVideoStats(Constants.VideoSourceType source, LocalVideoStats
    @Override
    public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
-       if(buttonView == switchTransparentBackground){
-           engine.enableVirtualBackground(isChecked, new VirtualBackgroundSource(VirtualBackgroundSource.BACKGROUND_COLOR, Color.TRANSPARENT, "", VirtualBackgroundSource.BLUR_DEGREE_HIGH), new SegmentationProperty());
+       if (buttonView == switchTransparentBackground) {
+           engine.enableVirtualBackground(isChecked,
+                   new VirtualBackgroundSource(VirtualBackgroundSource.BACKGROUND_COLOR,
+                           Color.TRANSPARENT, "",
+                           VirtualBackgroundSource.BLUR_DEGREE_HIGH),
+                   new SegmentationProperty());
        }
    }
}
diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/MediaPlayer.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/MediaPlayer.java
index 3a3eff45e..b57a7f749 100644
--- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/MediaPlayer.java
+++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/MediaPlayer.java
@@ -19,10 +19,14 @@
 import android.view.SurfaceView;
 import android.view.View;
 import android.view.ViewGroup;
+import android.widget.AdapterView;
+import android.widget.ArrayAdapter;
 import android.widget.Button;
 import android.widget.EditText;
 import android.widget.FrameLayout;
+import android.widget.LinearLayout;
 import android.widget.SeekBar;
+import android.widget.Spinner;

 import androidx.annotation.NonNull;
 import androidx.annotation.Nullable;
@@ -30,6 +34,9 @@
 import com.yanzhenjie.permission.AndPermission;
 import com.yanzhenjie.permission.runtime.Permission;

+import java.util.ArrayList;
+import java.util.List;
+
 import io.agora.api.example.MainApplication;
 import io.agora.api.example.R;
 import io.agora.api.example.annotation.Example;
@@ -38,6 +45,10 @@
 import io.agora.api.example.utils.TokenUtils;
 import io.agora.mediaplayer.IMediaPlayer;
 import io.agora.mediaplayer.IMediaPlayerObserver;
+import io.agora.mediaplayer.data.CacheStatistics;
+import io.agora.mediaplayer.data.MediaPlayerSource;
+import io.agora.mediaplayer.data.MediaStreamInfo;
+import io.agora.mediaplayer.data.PlayerPlaybackStats;
 import io.agora.mediaplayer.data.PlayerUpdatedInfo;
 import io.agora.mediaplayer.data.SrcInfo;
 import io.agora.rtc2.ChannelMediaOptions;
@@ -49,6 +60,9 @@
 import io.agora.rtc2.video.VideoCanvas;
 import io.agora.rtc2.video.VideoEncoderConfiguration;

+/**
+ * The type Media player.
+ */
 @Example(
     index = 17,
     group = ADVANCED,
@@ -56,7 +70,7 @@
     actionId = R.id.action_mainFragment_to_MediaPlayer,
     tipsId = R.string.mediaplayer
 )
-public class MediaPlayer extends BaseFragment implements View.OnClickListener, IMediaPlayerObserver {
+public class MediaPlayer extends BaseFragment implements View.OnClickListener, AdapterView.OnItemSelectedListener, IMediaPlayerObserver {
     private static final String TAG = MediaPlayer.class.getSimpleName();
@@ -67,6 +81,10 @@ public class MediaPlayer extends BaseFragment implements View.OnClickListener, I
     private ChannelMediaOptions options = new ChannelMediaOptions();
     private int myUid;
     private FrameLayout fl_local, fl_remote;
+    private LinearLayout ll_streams;
+    private Spinner sp_player_stream, sp_publish_stream;
+    private List<MediaStreamInfo> mediaStreamInfoList;
+    private int playerStreamIndex, publishStreamIndex;
     private boolean joined = false;
     private SeekBar progressBar;
@@ -90,30 +108,30 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) {
     }
     try {
         RtcEngineConfig config = new RtcEngineConfig();
-        /**
+        /*
          * The context of Android Activity
          */
        config.mContext = context.getApplicationContext();
-       /**
+       /*
        * The App ID issued to you by Agora. See How to get the App ID
        */
       config.mAppId = getString(R.string.agora_app_id);
-      /** Sets the channel profile of the Agora RtcEngine.
+      /* Sets the channel profile of the Agora RtcEngine.
      CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile.
      Use this profile in one-on-one calls or group calls, where all users can talk freely.
      CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast
      channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams;
      an audience can only receive streams.*/
      config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING;
-     /**
+     /*
      * IRtcEngineEventHandler is an abstract class providing default implementation.
     * The SDK uses this class to report to the app on SDK runtime events.
     */
    config.mEventHandler = iRtcEngineEventHandler;
    config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT);
-   config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode();
+   config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode();
    engine = RtcEngine.create(config);
-   /**
+   /*
    * This parameter is for reporting the usages of APIExample to agora background.
    * Generally, it is not necessary for you to set this parameter.
    */
@@ -148,6 +166,11 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat
    stop = view.findViewById(R.id.stop);
    pause = view.findViewById(R.id.pause);
    publish = view.findViewById(R.id.publish);
+   ll_streams = view.findViewById(R.id.ll_streams);
+   sp_publish_stream = view.findViewById(R.id.sp_publish_stream);
+   sp_player_stream = view.findViewById(R.id.sp_player_stream);
+   sp_player_stream.setOnItemSelectedListener(this);
+   sp_publish_stream.setOnItemSelectedListener(this);
    progressBar = view.findViewById(R.id.ctrl_progress_bar);
    progressBar.setMax(100);
@@ -199,14 +222,13 @@ public void onClick(View v) {
            Permission.Group.STORAGE,
            Permission.Group.MICROPHONE,
            Permission.Group.CAMERA
-   ).onGranted(permissions ->
-   {
+   ).onGranted(permissions -> {
        // Permissions Granted
        joinChannel(channelId);
    }).start();
} else {
    joined = false;
-   /**After joining a channel, the user must call the leaveChannel method to end the
+   /*After joining a channel, the user must call the leaveChannel method to end the
    * call before joining another channel. This method returns 0 if the user leaves the
    * channel and releases all resources related to the call. This method call is
    * asynchronous, and the user has not exited the channel when the method call returns.
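The MediaPlayer hunks in this patch replace `mediaPlayer.open(url, 0)` with `openWithMediaSource` plus `setEnableMultiAudioTrack(true)`, then fill two spinners with only the player's audio streams so `selectMultiAudioTrack` can choose separate playback and publish tracks. That filtering step is plain list logic and can be sketched without the SDK; `StreamInfo` below is a hypothetical stand-in for `io.agora.mediaplayer.data.MediaStreamInfo`, reduced to the two accessors the sketch needs.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for io.agora.mediaplayer.data.MediaStreamInfo,
// reduced to the two pieces of information the selection logic uses.
class StreamInfo {
    private final int streamIndex;
    private final boolean audio;

    StreamInfo(int streamIndex, boolean audio) {
        this.streamIndex = streamIndex;
        this.audio = audio;
    }

    int getStreamIndex() {
        return streamIndex;
    }

    boolean isAudio() {
        return audio;
    }
}

public class AudioTrackSelection {
    // Mirrors setMediaPlayerViewEnable(): keep only the audio streams, in order,
    // so spinner position i maps to audioStreams.get(i).getStreamIndex().
    static List<StreamInfo> audioStreamsOf(List<StreamInfo> allStreams) {
        List<StreamInfo> audio = new ArrayList<>();
        for (StreamInfo info : allStreams) {
            if (info.isAudio()) {
                audio.add(info);
            }
        }
        return audio;
    }

    public static void main(String[] args) {
        List<StreamInfo> all = new ArrayList<>();
        all.add(new StreamInfo(0, false)); // video stream, filtered out
        all.add(new StreamInfo(1, true));  // audio track
        all.add(new StreamInfo(2, true));  // audio track
        List<StreamInfo> audio = audioStreamsOf(all);
        System.out.println(audio.size());                  // 2
        System.out.println(audio.get(0).getStreamIndex()); // 1
    }
}
```

In the actual case, the indexes picked this way are then passed to `mediaPlayer.selectMultiAudioTrack(playerStreamIndex, publishStreamIndex)` from `onItemSelected`.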
@@ -233,7 +255,10 @@ public void onClick(View v) {
        }
    } else if (v.getId() == R.id.open) {
        String url = et_url.getText().toString();
        if (!TextUtils.isEmpty(url)) {
-           mediaPlayer.open(url, 0);
+           MediaPlayerSource source = new MediaPlayerSource();
+           source.setUrl(url);
+           source.setEnableMultiAudioTrack(true);
+           mediaPlayer.openWithMediaSource(source);
        }
    } else if (v.getId() == R.id.play) {
        mediaPlayer.play();
@@ -250,6 +275,24 @@ public void onClick(View v) {
        }
    }

+   @Override
+   public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
+       if (parent == sp_player_stream) {
+           MediaStreamInfo mediaStreamInfo = mediaStreamInfoList.get(sp_player_stream.getSelectedItemPosition());
+           playerStreamIndex = mediaStreamInfo.getStreamIndex();
+           mediaPlayer.selectMultiAudioTrack(playerStreamIndex, publishStreamIndex);
+       } else if (parent == sp_publish_stream) {
+           MediaStreamInfo mediaStreamInfo = mediaStreamInfoList.get(sp_publish_stream.getSelectedItemPosition());
+           publishStreamIndex = mediaStreamInfo.getStreamIndex();
+           mediaPlayer.selectMultiAudioTrack(playerStreamIndex, publishStreamIndex);
+       }
+   }
+
+   @Override
+   public void onNothingSelected(AdapterView<?> parent) {
+
+   }
+
    private void joinChannel(String channelId) {
        // Check if the context is valid
        Context context = getContext();
@@ -257,7 +300,7 @@ private void joinChannel(String channelId) {
            return;
        }

-       /**In the demo, the default is to enter as the anchor.*/
+       /*In the demo, the default is to enter as the anchor.*/
        engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER);
        // Enable video module
        engine.enableVideo();
@@ -276,14 +319,14 @@ private void joinChannel(String channelId) {
        }
        fl_local.addView(surfaceView);
        // Setup local video to render your local media player view
-       VideoCanvas videoCanvas = new VideoCanvas(surfaceView, Constants.RENDER_MODE_HIDDEN,  0);
+       VideoCanvas videoCanvas = new VideoCanvas(surfaceView, Constants.RENDER_MODE_HIDDEN, 0);
        videoCanvas.sourceType = Constants.VIDEO_SOURCE_MEDIA_PLAYER;
        videoCanvas.mediaPlayerId = mediaPlayer.getMediaPlayerId();
        engine.setupLocalVideo(videoCanvas);
        // Set audio route to microPhone
        engine.setDefaultAudioRoutetoSpeakerphone(true);

-       /**In the demo, the default is to enter as the anchor.*/
+       /*In the demo, the default is to enter as the anchor.*/
        engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER);
        // set options
        options.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER;
@@ -294,13 +337,13 @@ private void joinChannel(String channelId) {
        options.publishMicrophoneTrack = false;
        options.enableAudioRecordingOrPlayout = true;

-       /**Please configure accessToken in the string_config file.
+       /*Please configure accessToken in the string_config file.
        * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see
        *   https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token
        * A token generated at the server. This applies to scenarios with high-security requirements. For details, see
        *   https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/
        TokenUtils.gen(requireContext(), channelId, 0, ret -> {
-           /** Allows a user to join a channel.
+           /* Allows a user to join a channel.
               if you do not specify the uid, we will generate the uid for you*/
            int res = engine.joinChannel(ret, channelId, 0, options);
            if (res != 0) {
@@ -451,14 +494,13 @@ public void onUserJoined(int uid, int elapsed) {
        super.onUserJoined(uid, elapsed);
        Log.i(TAG, "onUserJoined->" + uid);
        showLongToast(String.format("user %d joined!", uid));
-       /**Check if the context is correct*/
+       /*Check if the context is correct*/
        Context context = getContext();
        if (context == null) {
            return;
        }
-       handler.post(() ->
-       {
-           /**Display remote video stream*/
+       handler.post(() -> {
+           /*Display remote video stream*/
            SurfaceView surfaceView = null;
            if (fl_remote.getChildCount() > 0) {
                fl_remote.removeAllViews();
@@ -491,7 +533,7 @@ public void onUserOffline(int uid, int reason) {
        handler.post(new Runnable() {
            @Override
            public void run() {
-               /**Clear render view
+               /*Clear render view
                Note: The video will stay at its last frame, to completely remove it you will need to
                remove the SurfaceView from its parent*/
                engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid));
@@ -503,7 +545,7 @@ public void run() {
    @Override
    public void onDestroy() {
        super.onDestroy();
-       /**leaveChannel and Destroy the RtcEngine instance*/
+       /*leaveChannel and Destroy the RtcEngine instance*/
        if (mediaPlayer != null) {
            mediaPlayer.unRegisterPlayerObserver(this);
            mediaPlayer.destroy();
@@ -517,23 +559,52 @@ public void onDestroy() {
    }

    private void setMediaPlayerViewEnable(boolean enable) {
-       handler.post(new Runnable() {
-           @Override
-           public void run() {
-               play.setEnabled(enable);
-               stop.setEnabled(enable);
-               pause.setEnabled(enable);
-               publish.setEnabled(enable);
+       runOnUIThread(() -> {
+           play.setEnabled(enable);
+           stop.setEnabled(enable);
+           pause.setEnabled(enable);
+           publish.setEnabled(enable);
+
+           if (enable) {
+               ll_streams.setVisibility(View.VISIBLE);
+               mediaStreamInfoList = new ArrayList<>();
+               for (int i = 0; i < mediaPlayer.getStreamCount(); i++) {
+                   MediaStreamInfo streamInfo = mediaPlayer.getStreamInfo(i);
+                   if (streamInfo.getMediaStreamType()
+                           == io.agora.mediaplayer.Constants.MediaStreamType.getValue(
+                           io.agora.mediaplayer.Constants.MediaStreamType.STREAM_TYPE_AUDIO)) {
+                       mediaStreamInfoList.add(streamInfo);
+                   }
+               }
+
+               String[] trackNames = new String[mediaStreamInfoList.size()];
+               for (int i = 0; i < mediaStreamInfoList.size(); i++) {
+                   trackNames[i] = getString(R.string.audio_stream_index, i);
+               }
+               sp_player_stream.setAdapter(
+                       new ArrayAdapter<>(requireContext(), android.R.layout.simple_spinner_dropdown_item,
+                               android.R.id.text1, trackNames)
+               );
+               sp_publish_stream.setAdapter(
+                       new ArrayAdapter<>(requireContext(), android.R.layout.simple_spinner_dropdown_item,
+                               android.R.id.text1, trackNames)
+               );
+               if (mediaStreamInfoList.size() > 0) {
+                   playerStreamIndex = mediaStreamInfoList.get(0).getStreamIndex();
+                   publishStreamIndex = mediaStreamInfoList.get(0).getStreamIndex();
+               }
+           } else {
+               ll_streams.setVisibility(View.GONE);
            }
        });
    }

    @Override
-   public void onPlayerStateChanged(io.agora.mediaplayer.Constants.MediaPlayerState mediaPlayerState, io.agora.mediaplayer.Constants.MediaPlayerError mediaPlayerError) {
-       Log.e(TAG, "onPlayerStateChanged mediaPlayerState " + mediaPlayerState + ", error=" + mediaPlayerError);
-       if (mediaPlayerState.equals(PLAYER_STATE_OPEN_COMPLETED)) {
+   public void onPlayerStateChanged(io.agora.mediaplayer.Constants.MediaPlayerState state, io.agora.mediaplayer.Constants.MediaPlayerReason reason) {
+       Log.e(TAG, "onPlayerStateChanged mediaPlayerState " + state + ", reason=" + reason);
+       if (state.equals(PLAYER_STATE_OPEN_COMPLETED)) {
            setMediaPlayerViewEnable(true);
-       } else if (mediaPlayerState.equals(PLAYER_STATE_IDLE) || mediaPlayerState.equals(PLAYER_STATE_STOPPED) || mediaPlayerState.equals(PLAYER_STATE_PLAYBACK_COMPLETED) ) {
+       } else if (state.equals(PLAYER_STATE_IDLE) || state.equals(PLAYER_STATE_STOPPED) || state.equals(PLAYER_STATE_PLAYBACK_COMPLETED)) {
            setMediaPlayerViewEnable(false);
            options.publishMediaPlayerVideoTrack = false;
            options.publishMediaPlayerAudioTrack = false;
@@ -542,10 +613,10 @@ public void onPlayerStateChanged(io.agora.mediaplayer.Constants.MediaPlayerState
    }

    @Override
-   public void onPositionChanged(long position) {
-       Log.e(TAG, "onPositionChanged position " + position);
+   public void onPositionChanged(long positionMs, long timestampMs) {
+       Log.e(TAG, "onPositionChanged position " + positionMs);
        if (playerDuration > 0) {
-           final int result = (int) ((float) position / (float) playerDuration * 100);
+           final int result = (int) ((float) positionMs / (float) playerDuration * 100);
            handler.post(new Runnable() {
                @Override
                public void run() {
@@ -589,6 +660,16 @@ public void onPlayerInfoUpdated(PlayerUpdatedInfo playerUpdatedInfo) {

    }

+   @Override
+   public void onPlayerCacheStats(CacheStatistics stats) {
+
+   }
+
+   @Override
+   public void onPlayerPlaybackStats(PlayerPlaybackStats stats) {
+
+   }
+
    @Override
    public void onAudioVolumeIndication(int i) {
diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/MediaRecorder.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/MediaRecorder.java
index c1b408904..6f77922e1 100644
--- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/MediaRecorder.java
+++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/MediaRecorder.java
@@ -55,6 +55,10 @@
 import io.agora.rtc2.video.VideoCanvas;
 import io.agora.rtc2.video.VideoEncoderConfiguration;

+
+/**
+ * The type Media recorder.
+ */
 @Example(
     index = 17,
     group = ADVANCED,
@@ -107,22 +111,22 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) {
     }
     try {
         RtcEngineConfig config = new RtcEngineConfig();
-        /**
+        /*
          * The context of Android Activity
          */
        config.mContext = context.getApplicationContext();
-       /**
+       /*
        * The App ID issued to you by Agora. See How to get the App ID
        */
       config.mAppId = getString(R.string.agora_app_id);
-      /** Sets the channel profile of the Agora RtcEngine.
+      /* Sets the channel profile of the Agora RtcEngine.
      CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile.
      Use this profile in one-on-one calls or group calls, where all users can talk freely.
      CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast
      channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams;
      an audience can only receive streams.*/
      config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING;
-     /**
+     /*
      * IRtcEngineEventHandler is an abstract class providing default implementation.
      * The SDK uses this class to report to the app on SDK runtime events.
      */
@@ -130,7 +134,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) {
     config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT);
     config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode();
     engine = RtcEngine.create(config);
-    /**
+    /*
     * This parameter is for reporting the usages of APIExample to agora background.
     * Generally, it is not necessary for you to set this parameter.
     */
@@ -158,7 +162,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) {
 public void onDestroy() {
     super.onDestroy();
     stopAllMediaRecorder();
-    /**leaveChannel and Destroy the RtcEngine instance*/
+    /*leaveChannel and Destroy the RtcEngine instance*/
     if (engine != null) {
         engine.leaveChannel();
     }
@@ -194,15 +198,14 @@ public void onClick(View v) {
     // Request permission
     AndPermission.with(this).runtime().permission(
             permissionArray
-    ).onGranted(permissions ->
-    {
+    ).onGranted(permissions -> {
         // Permissions Granted
         joinChannel(channelId);
     }).start();
 } else {
     joined = false;
     stopAllMediaRecorder();
-    /**After joining a channel, the user must call the leaveChannel method to end the
+    /*After joining a channel, the user must call the leaveChannel method to end the
     * call before joining another channel. This method returns 0 if the user leaves the
     * channel and releases all resources related to the call. This method call is
     * asynchronous, and the user has not exited the channel when the method call returns.
@@ -256,7 +259,7 @@ private void joinChannel(String channelId) {
     // Set audio route to microPhone
     engine.setDefaultAudioRoutetoSpeakerphone(true);

-    /**In the demo, the default is to enter as the anchor.*/
+    /*In the demo, the default is to enter as the anchor.*/
     engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER);
     // Enable video module
     engine.enableVideo();
@@ -274,14 +277,14 @@ private void joinChannel(String channelId) {
     option.publishMicrophoneTrack = true;
     option.publishCameraTrack = true;

-    /**Please configure accessToken in the string_config file.
+    /*Please configure accessToken in the string_config file.
     * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see
     *   https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token
     * A token generated at the server. This applies to scenarios with high-security requirements. For details, see
     *   https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/
     TokenUtils.gen(requireContext(), channelId, 0, ret -> {
-        /** Allows a user to join a channel.
+        /* Allows a user to join a channel.
            if you do not specify the uid, we will generate the uid for you*/
         int res = engine.joinChannel(ret, channelId, 0, option);
         if (res != 0) {
@@ -297,7 +300,7 @@ private void joinChannel(String channelId) {
     });
 }

-private void stopAllMediaRecorder(){
+private void stopAllMediaRecorder() {
     stopLocalMediaRecorder();
     Set<Integer> remoteUidList = remoteMediaRecorders.keySet();
     for (Integer uid : remoteUidList) {
@@ -307,7 +310,7 @@ private void stopAllMediaRecorder(){
 private void stopRemoteMediaRecorder(int uid) {
     AgoraMediaRecorder mediaRecorder = remoteMediaRecorders.get(uid);
-    if(mediaRecorder == null){
+    if (mediaRecorder == null) {
         return;
     }
     // Stop Local Recording
@@ -331,12 +334,12 @@ public void onRecorderStateChanged(String channelId, int uid, int state, int err
     Log.d(TAG, "RemoteMediaRecorder -- onRecorderStateChanged channelId=" + channelId + ", uid=" + uid + ", state=" + state + ", error=" + error);
     if (state == AgoraMediaRecorder.RECORDER_STATE_STOP) {
         showRecordMediaPathDialog(storagePath);
-    } else if (state == AgoraMediaRecorder.RECORDER_STATE_ERROR && error == AgoraMediaRecorder.RECORDER_ERROR_CONFIG_CHANGED) {
+    } else if (state == AgoraMediaRecorder.RECORDER_STATE_ERROR && error == AgoraMediaRecorder.RECORDER_REASON_CONFIG_CHANGED) {
         // switch camera while recording
         runOnUIThread(() -> {
             VideoReportLayout userView = getUserView(uid);
-            if(userView != null){
-                Button btnRecording = ((ViewGroup)userView.getParent()).findViewWithTag(getString(R.string.recording_tag));
+            if (userView != null) {
+                Button btnRecording = ((ViewGroup) userView.getParent()).findViewWithTag(getString(R.string.recording_tag));
                 btnRecording.setText(R.string.start_recording);
             }
             stopRemoteMediaRecorder(uid);
@@ -346,7 +349,9 @@ public void onRecorderStateChanged(String channelId, int uid, int state, int err
     @Override
     public void onRecorderInfoUpdated(String channelId, int uid, RecorderInfo info) {
-        Log.d(TAG, "RemoteMediaRecorder -- onRecorderInfoUpdated channelId=" + channelId + ", uid=" + uid + ", fileName=" + info.fileName + ", durationMs=" + info.durationMs + ", fileSize=" + info.fileSize);
+        Log.d(TAG, "RemoteMediaRecorder -- onRecorderInfoUpdated channelId="
+                + channelId + ", uid=" + uid + ", fileName=" + info.fileName
+                + ", durationMs=" + info.durationMs + ", fileSize=" + info.fileSize);
     }
 });
 remoteMediaRecorders.put(uid, mediaRecorder);
@@ -359,7 +364,7 @@ public void onRecorderInfoUpdated(String channelId, int uid, RecorderInfo info)
 }

 private void stopLocalMediaRecorder() {
-    if(localMediaRecorder == null){
+    if (localMediaRecorder == null) {
         return;
     }
     // Stop Local Recording
@@ -383,12 +388,12 @@ public void onRecorderStateChanged(String channelId, int uid, int state, int err
     Log.d(TAG, "LocalMediaRecorder -- onRecorderStateChanged channelId=" + channelId + ", uid=" + uid + ", state=" + state + ", error=" + error);
     if (state == AgoraMediaRecorder.RECORDER_STATE_STOP) {
         showRecordMediaPathDialog(storagePath);
-    } else if (state == AgoraMediaRecorder.RECORDER_STATE_ERROR && error == AgoraMediaRecorder.RECORDER_ERROR_CONFIG_CHANGED) {
+    } else if (state == AgoraMediaRecorder.RECORDER_STATE_ERROR && error == AgoraMediaRecorder.RECORDER_REASON_CONFIG_CHANGED) {
         // switch camera while recording
         runOnUIThread(() -> {
             VideoReportLayout userView = fl_local;
-            if(userView != null){
-                Button btnRecording = ((ViewGroup)userView.getParent()).findViewWithTag(getString(R.string.recording_tag));
+            if (userView != null) {
+                Button btnRecording = ((ViewGroup) userView.getParent()).findViewWithTag(getString(R.string.recording_tag));
                 btnRecording.setText(R.string.start_recording);
             }
             stopLocalMediaRecorder();
@@ -398,7 +403,9 @@ public void onRecorderStateChanged(String channelId, int uid, int state, int err
     @Override
     public void onRecorderInfoUpdated(String channelId, int uid, RecorderInfo info) {
-        Log.d(TAG, "LocalMediaRecorder -- onRecorderInfoUpdated channelId=" + channelId + ", uid=" + uid + ", fileName=" + info.fileName + ", durationMs=" + info.durationMs + ", fileSize=" + info.fileSize);
+        Log.d(TAG, "LocalMediaRecorder -- onRecorderInfoUpdated channelId="
+                + channelId + ", uid=" + uid + ", fileName=" + info.fileName
+                + ", durationMs=" + info.durationMs + ", fileSize=" + info.fileSize);
     }
 });
 }
@@ -410,7 +417,7 @@ public void onRecorderInfoUpdated(String channelId, int uid, RecorderInfo info)
 }

 private void setupLayoutRecording(@NonNull ViewGroup reportLayout, @NonNull Runnable onStart, @NonNull Runnable onStop) {
-    Button btnRecording = ((ViewGroup)reportLayout.getParent()).findViewWithTag(getString(R.string.recording_tag));
+    Button btnRecording = ((ViewGroup) reportLayout.getParent()).findViewWithTag(getString(R.string.recording_tag));
     if (btnRecording == null) {
         return;
     }
@@ -429,7 +436,7 @@ private void setupLayoutRecording(@NonNull ViewGroup reportLayout, @NonNull Runn
     });
 }

-private void showRecordMediaPathDialog(String path){
+private void showRecordMediaPathDialog(String path) {
     runOnUIThread(() -> {
         new AlertDialog.Builder(requireContext())
                 .setTitle("MediaFilePath")
@@ -440,7 +447,7 @@ private void showRecordMediaPathDialog(String path){
 }

 private void resetLayoutRecording(@NonNull ViewGroup reportLayout) {
-    Button btnRecording = ((ViewGroup)reportLayout.getParent()).findViewWithTag(getString(R.string.recording_tag));
+    Button btnRecording = ((ViewGroup) reportLayout.getParent()).findViewWithTag(getString(R.string.recording_tag));
     if (btnRecording == null) {
         return;
     }
@@ -598,7 +605,7 @@ public void onUserJoined(int uid, int elapsed) {
     super.onUserJoined(uid, elapsed);
     Log.i(TAG, "onUserJoined->" + uid);
     showLongToast(String.format("user %d joined!", uid));
-    /**Check if the context is correct*/
+    /*Check if the context is correct*/
     Context context = getContext();
     if (context == null) {
         return;
@@ -606,9 +613,8 @@ public void onUserJoined(int uid, int elapsed) {
     if (remoteViews.containsKey(uid)) {
         return;
     } else {
-        handler.post(() ->
-        {
-            /**Display remote video stream*/
+        handler.post(() -> {
+            /*Display remote video stream*/
             SurfaceView surfaceView = null;
             // Create render view by RtcEngine
             surfaceView = new SurfaceView(context);
@@ -642,7 +648,7 @@ public void onUserOffline(int uid, int reason) {
     handler.post(new Runnable() {
         @Override
         public void run() {
-            /**Clear render view
+            /*Clear render view
             Note: The video will stay at its last frame, to completely remove it you will need to
             remove the SurfaceView from its parent*/
             engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid));
@@ -696,7 +702,7 @@ private VideoReportLayout getAvailableView() {
     }
 }

-private VideoReportLayout getUserView(int uid){
+private VideoReportLayout getUserView(int uid) {
     VideoReportLayout[] layouts = new VideoReportLayout[]{fl_remote, fl_remote_2, fl_remote_3};
     for (VideoReportLayout layout : layouts) {
         if (layout.getReportUid() == uid) {
diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/MultiVideoSourceTracks.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/MultiVideoSourceTracks.java
index bc82dbca6..627300696 100644
--- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/MultiVideoSourceTracks.java
+++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/MultiVideoSourceTracks.java
@@ -64,6 +64,9 @@
 import io.agora.rtc2.video.VideoCanvas;
 import io.agora.rtc2.video.VideoEncoderConfiguration;

+/**
+ * The type Multi video source tracks.
+ */
 @Example(
     index = 10,
     group = ADVANCED,
@@ -127,22 +130,22 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) {
     }
     try {
         RtcEngineConfig config = new RtcEngineConfig();
-        /**
+        /*
          * The context of Android Activity
          */
        config.mContext = context.getApplicationContext();
-       /**
+       /*
        * The App ID issued to you by Agora. See How to get the App ID
        */
       config.mAppId = getString(R.string.agora_app_id);
-      /** Sets the channel profile of the Agora RtcEngine.
+      /* Sets the channel profile of the Agora RtcEngine.
      CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile.
      Use this profile in one-on-one calls or group calls, where all users can talk freely.
      CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast
      channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams;
      an audience can only receive streams.*/
      config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING;
-     /**
+     /*
      * IRtcEngineEventHandler is an abstract class providing default implementation.
      * The SDK uses this class to report to the app on SDK runtime events.
      */
@@ -150,7 +153,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) {
     config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT);
     config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode();
     engine = (RtcEngineEx) RtcEngine.create(config);
-    /**
+    /*
     * This parameter is for reporting the usages of APIExample to agora background.
     * Generally, it is not necessary for you to set this parameter.
     */
@@ -176,7 +179,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) {

 @Override
 public void onDestroy() {
-    /**leaveChannel and Destroy the RtcEngine instance*/
+    /*leaveChannel and Destroy the RtcEngine instance*/
     if (engine != null) {
         destroyAllPushingVideoTrack();
         engine.leaveChannel();
@@ -205,8 +208,7 @@ public void onClick(View v) {
         Permission.Group.STORAGE,
         Permission.Group.MICROPHONE,
         Permission.Group.CAMERA
-    ).onGranted(permissions ->
-    {
+    ).onGranted(permissions -> {
         // Permissions Granted
         joinChannel(channelId);
     }).start();
@@ -230,10 +232,10 @@ private void joinChannel(String channelId) {
         return;
     }

-    /**Set up to play remote sound with receiver*/
+    /*Set up to play remote sound with receiver*/
     engine.setDefaultAudioRoutetoSpeakerphone(true);

-    /**In the demo, the default is to enter as the anchor.*/
+    /*In the demo, the default is to enter as the anchor.*/
     engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER);
     // Enables the video module.
     engine.enableVideo();
@@ -245,13 +247,13 @@ private void joinChannel(String channelId) {
         VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation())
     ));

-    /**Please configure accessToken in the string_config file.
+    /*Please configure accessToken in the string_config file.
     * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see
     *   https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token
     * A token generated at the server. This applies to scenarios with high-security requirements. For details, see
     *   https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/
     TokenUtils.gen(requireContext(), channelId, 0, accessToken -> {
-        /** Allows a user to join a channel.
+        /* Allows a user to join a channel.
            if you do not specify the uid, we will generate the uid for you*/
        ChannelMediaOptions option = new ChannelMediaOptions();
@@ -276,11 +278,11 @@ private void joinChannel(String channelId) {
  * Push video frame by i420.
  *
  * @param trackId video track id.
- * @param yuv i420 data
- * @param width width
- * @param height height
+ * @param yuv     i420 data
+ * @param width   width
+ * @param height  height
 */
-private void pushVideoFrameByI420(int trackId, byte[] yuv, int width, int height){
+private void pushVideoFrameByI420(int trackId, byte[] yuv, int width, int height) {
    JavaI420Buffer i420Buffer = JavaI420Buffer.allocate(width, height);
    i420Buffer.getDataY().put(yuv, 0, i420Buffer.getDataY().limit());
    i420Buffer.getDataU().put(yuv, i420Buffer.getDataY().limit(), i420Buffer.getDataU().limit());
@@ -299,7 +301,7 @@ private void pushVideoFrameByI420(int trackId, byte[] yuv, int width, int height
    /*
     * Pushes the external video frame to the app.
     */
-   int ret = engine.pushExternalVideoFrameEx(videoFrame, trackId);
+   int ret = engine.pushExternalVideoFrameById(videoFrame, trackId);
    i420Buffer.release();
@@ -312,11 +314,11 @@ private void pushVideoFrameByI420(int trackId, byte[] yuv, int width, int height
  * Push video frame by nv21.
  *
  * @param trackId video track id.
- * @param nv21 nv21
- * @param width width
- * @param height height
+ * @param nv21    nv21
+ * @param width   width
+ * @param height  height
 */
-private void pushVideoFrameByNV21(int trackId, byte[] nv21, int width, int height){
+private void pushVideoFrameByNV21(int trackId, byte[] nv21, int width, int height) {
    VideoFrame.Buffer frameBuffer = new NV21Buffer(nv21, width, height, null);
@@ -333,7 +335,7 @@ private void pushVideoFrameByNV21(int trackId, byte[] nv21, int width, int heigh
    /*
     * Pushes the external video frame to the app.
*/ - int ret = engine.pushExternalVideoFrameEx(videoFrame, trackId); + int ret = engine.pushExternalVideoFrameById(videoFrame, trackId); if (ret != Constants.ERR_OK) { Log.w(TAG, "pushExternalVideoFrame error"); @@ -344,11 +346,11 @@ private void pushVideoFrameByNV21(int trackId, byte[] nv21, int width, int heigh * Push video frame by nv12. * * @param trackId video track id. - * @param nv12 nv12 buffer. - * @param width width. - * @param height height. + * @param nv12 nv12 buffer. + * @param width width. + * @param height height. */ - private void pushVideoFrameByNV12(int trackId, ByteBuffer nv12, int width, int height){ + private void pushVideoFrameByNV12(int trackId, ByteBuffer nv12, int width, int height) { VideoFrame.Buffer frameBuffer = new NV12Buffer(width, height, width, height, nv12, null); /* @@ -364,7 +366,7 @@ private void pushVideoFrameByNV12(int trackId, ByteBuffer nv12, int width, int h /* * Pushes the external video frame to the app. */ - int ret = engine.pushExternalVideoFrameEx(videoFrame, trackId); + int ret = engine.pushExternalVideoFrameById(videoFrame, trackId); if (ret != Constants.ERR_OK) { Log.w(TAG, "pushExternalVideoFrame error"); @@ -374,14 +376,14 @@ private void pushVideoFrameByNV12(int trackId, ByteBuffer nv12, int width, int h /** * Push video frame by texture id. * - * @param trackId video track id. - * @param textureId texture id. + * @param trackId video track id. + * @param textureId texture id. * @param textureType texture type. rgb or oes. - * @param width width. - * @param height height. + * @param width width. + * @param height height. 
*/ @GLThread - private void pushVideoFrameByTexture(int trackId, int textureId, VideoFrame.TextureBuffer.Type textureType, int width, int height){ + private void pushVideoFrameByTexture(int trackId, int textureId, VideoFrame.TextureBuffer.Type textureType, int width, int height) { VideoFrame.Buffer frameBuffer = new TextureBuffer( EglBaseProvider.getCurrentEglContext(), width, @@ -407,7 +409,7 @@ private void pushVideoFrameByTexture(int trackId, int textureId, VideoFrame.Text /* * Pushes the external video frame to the app. */ - int ret = engine.pushExternalVideoFrameEx(videoFrame, trackId); + int ret = engine.pushExternalVideoFrameById(videoFrame, trackId); if (ret != Constants.ERR_OK) { Log.w(TAG, "pushExternalVideoFrame error"); @@ -606,7 +608,7 @@ private void createPushingEncodedVidTrack() { frameInfo.codecType = Constants.VIDEO_CODEC_H264; frameInfo.framesPerSecond = frameRate; frameInfo.frameType = isKeyFrame ? Constants.VIDEO_FRAME_TYPE_KEY_FRAME : Constants.VIDEO_FRAME_TYPE_DELTA_FRAME; - int ret = engine.pushExternalEncodedVideoFrameEx(buffer, frameInfo, videoTrack); + int ret = engine.pushExternalEncodedVideoFrameById(buffer, frameInfo, videoTrack); if (ret != Constants.ERR_OK) { Log.e(TAG, "pushExternalEncodedVideoFrame error: " + ret); } @@ -633,9 +635,9 @@ private int destroyLastPushingVideoTrack() { RtcConnection connection = connections.remove(lastIndex); Iterator fileReaderIterator = videoFileReaders.iterator(); - while (fileReaderIterator.hasNext()){ + while (fileReaderIterator.hasNext()) { VideoFileReader next = fileReaderIterator.next(); - if(next.getTrackId() == videoTrack){ + if (next.getTrackId() == videoTrack) { next.stop(); /* * destroy a created custom video track id @@ -652,9 +654,9 @@ private int destroyLastPushingVideoTrack() { } Iterator extractorIterator = videoExtractorThreads.iterator(); - while (extractorIterator.hasNext()){ + while (extractorIterator.hasNext()) { MediaExtractorThread next = extractorIterator.next(); - 
if(next.getTrackId() == videoTrack){ + if (next.getTrackId() == videoTrack) { next.stop(); /* * destroy a created custom video track id @@ -787,14 +789,13 @@ public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; } - handler.post(() -> - { - /**Display remote video stream*/ + handler.post(() -> { + /*Display remote video stream*/ View videoView = createVideoView(uid); // Setup remote video to render engine.setupRemoteVideo(new VideoCanvas(videoView, RENDER_MODE_FIT, uid)); @@ -816,7 +817,7 @@ public void onUserOffline(int uid, int reason) { Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason)); showLongToast(String.format("user %d offline! reason:%d", uid, reason)); runOnUIThread(() -> { - /**Clear render view + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ resetVideoLayout(uid); @@ -844,6 +845,17 @@ public void onRemoteAudioStats(RemoteAudioStats stats) { private interface MediaExtractorCallback { + /** + * On extract frame. + * + * @param buffer the buffer + * @param presentationTimeUs the presentation time us + * @param size the size + * @param isKeyFrame the is key frame + * @param width the width + * @param height the height + * @param frameRate the frame rate + */ void onExtractFrame(ByteBuffer buffer, long presentationTimeUs, int size, boolean isKeyFrame, int width, int height, int frameRate); } @@ -861,6 +873,11 @@ private MediaExtractorThread(int trackId, String path, MediaExtractorCallback ca this.callback = callback; } + /** + * Gets track id. 
+ * + * @return the track id + */ public int getTrackId() { return trackId; } @@ -983,8 +1000,8 @@ private void releaseTextureBuffer() { * Yuv 2 texture id. * Run on gl thread. * - * @param yuv yuv - * @param width width + * @param yuv yuv + * @param width width * @param height height * @return rgba texture id */ @@ -999,8 +1016,8 @@ private int yuv2texture(byte[] yuv, int width, int height) { /** * Transform yuv to nv12 * - * @param yuv yuv - * @param width width + * @param yuv yuv + * @param width width * @param height height * @return nv12 */ @@ -1033,8 +1050,8 @@ private static ByteBuffer yuv2nv12(byte[] yuv, int width, int height) { /** * Transform yuv to nv21. * - * @param yuv yuv - * @param width width + * @param yuv yuv + * @param width width * @param height height * @return nv21 */ diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PictureInPicture.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PictureInPicture.java index 9e656a65b..0ea3c8267 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PictureInPicture.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PictureInPicture.java @@ -50,13 +50,7 @@ /** * This demo demonstrates how to make a one-to-one video call */ -@Example( - index = 11, - group = ADVANCED, - name = R.string.item_picture_in_picture, - actionId = R.id.action_mainFragment_to_picture_in_picture, - tipsId = R.string.picture_in_picture -) +@Example(index = 11, group = ADVANCED, name = R.string.item_picture_in_picture, actionId = R.id.action_mainFragment_to_picture_in_picture, tipsId = R.string.picture_in_picture) public class PictureInPicture extends BaseFragment implements View.OnClickListener { private static final String TAG = PictureInPicture.class.getSimpleName(); @@ -99,22 +93,22 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { } try { RtcEngineConfig config = new
RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ @@ -122,7 +116,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. 
*/ @@ -149,7 +143,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { @Override public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ + /*leaveChannel and Destroy the RtcEngine instance*/ if (engine != null) { engine.leaveChannel(); } @@ -184,16 +178,13 @@ public void onClick(View v) { return; } // Request permission - AndPermission.with(this).runtime().permission( - permissionArray - ).onGranted(permissions -> - { + AndPermission.with(this).runtime().permission(permissionArray).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. @@ -239,7 +230,7 @@ private void joinChannel(String channelId) { // Set audio route to microPhone engine.setDefaultAudioRoutetoSpeakerphone(true); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enable video module engine.enableVideo(); @@ -257,14 +248,14 @@ private void joinChannel(String channelId) { option.publishMicrophoneTrack = true; option.publishCameraTrack = true; - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. 
For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, ret -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ int res = engine.joinChannel(ret, channelId, 0, option); if (res != 0) { @@ -418,7 +409,7 @@ public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; @@ -426,9 +417,8 @@ public void onUserJoined(int uid, int elapsed) { if (fl_remote.getReportUid() > 0) { return; } - handler.post(() -> - { - /**Display remote video stream*/ + handler.post(() -> { + /*Display remote video stream*/ TextureView surfaceView = null; // Create render view by RtcEngine surfaceView = new TextureView(context); @@ -457,7 +447,7 @@ public void onUserOffline(int uid, int reason) { handler.post(new Runnable() { @Override public void run() { - /**Clear render view + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); @@ -518,7 +508,7 @@ private void showFloatWindow() { } private void dismissFloatWindow() { - if(!isFloatWindowShowing()){ + if (!isFloatWindowShowing()) { return; } FrameLayout container = floatWindowView.findViewById(R.id.fl_container); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PlayAudioFiles.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PlayAudioFiles.java index 664e791cf..947e2024f 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PlayAudioFiles.java +++ 
b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PlayAudioFiles.java @@ -41,6 +41,9 @@ import io.agora.rtc2.RtcEngineConfig; import io.agora.rtc2.proxy.LocalAccessPointConfiguration; +/** + * The type Play audio files. + */ @Example( index = 15, group = ADVANCED, @@ -66,23 +69,20 @@ public class PlayAudioFiles extends BaseFragment implements View.OnClickListener private AudioSeatManager audioSeatManager; @Override - public void onCreate(@Nullable Bundle savedInstanceState) - { + public void onCreate(@Nullable Bundle savedInstanceState) { super.onCreate(savedInstanceState); handler = new Handler(); } @Nullable @Override - public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) - { + public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) { View view = inflater.inflate(R.layout.fragment_play_audio_files, container, false); return view; } @Override - public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) - { + public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) { super.onViewCreated(view, savedInstanceState); join = view.findViewById(R.id.btn_join); @@ -130,7 +130,7 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat resetLayoutByJoin(); } - private void resetLayoutByJoin(){ + private void resetLayoutByJoin() { audioProfile.setEnabled(!joined); mixingStart.setClickable(joined); @@ -149,42 +149,39 @@ private void resetLayoutByJoin(){ } @Override - public void onActivityCreated(@Nullable Bundle savedInstanceState) - { + public void onActivityCreated(@Nullable Bundle savedInstanceState) { super.onActivityCreated(savedInstanceState); // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } - try - { + try { RtcEngineConfig config = new 
RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -203,9 +200,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) engine.setLocalAccessPoint(localAccessPointConfiguration); } preloadAudioEffect(); - } - catch (Exception e) - { + } catch (Exception e) { e.printStackTrace(); getActivity().onBackPressed(); } @@ -215,7 +210,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) * To ensure smooth communication, limit the size of the audio effect file. 
* We recommend using this method to preload the audio effect before calling the joinChannel method. */ - private void preloadAudioEffect(){ + private void preloadAudioEffect() { // Gets the global audio effect manager. audioEffectManager = engine.getAudioEffectManager(); // Preloads the audio effect (recommended). Note the file size, and preload the file before joining the channel. @@ -225,12 +220,10 @@ private void preloadAudioEffect(){ } @Override - public void onDestroy() - { + public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ - if(engine != null) - { + /*leaveChannel and Destroy the RtcEngine instance*/ + if (engine != null) { engine.leaveChannel(); } handler.post(RtcEngine::destroy); @@ -250,18 +243,14 @@ public void onNothingSelected(AdapterView parent) { } @Override - public void onClick(View v) - { - if (v == join) - { - if (!joined) - { + public void onClick(View v) { + if (v == join) { + if (!joined) { CommonUtil.hideInputBoard(getActivity(), et_channel); // call when join button hit String channelId = et_channel.getText().toString(); // Check permission - if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) - { + if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) { joinChannel(channelId); return; } @@ -269,16 +258,13 @@ public void onClick(View v) AndPermission.with(this).runtime().permission( Permission.Group.STORAGE, Permission.Group.MICROPHONE - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); - } - else - { + } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. 
This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. @@ -300,54 +286,41 @@ public void onClick(View v) resetLayoutByJoin(); audioSeatManager.downAllSeats(); } - } - else if (v == mixingStart) - { + } else if (v == mixingStart) { int ret = engine.startAudioMixing(Constant.MIX_FILE_PATH, false, -1, 0); Log.i(TAG, "startAudioMixing >> ret=" + ret); - } - else if (v == mixingResume) - { + } else if (v == mixingResume) { int ret = engine.resumeAudioMixing(); Log.i(TAG, "resumeAudioMixing >> ret=" + ret); - } - else if (v == mixingPause) - { + } else if (v == mixingPause) { int ret = engine.pauseAudioMixing(); Log.i(TAG, "pauseAudioMixing >> ret=" + ret); - } - else if (v == mixingStop) - { + } else if (v == mixingStop) { int ret = engine.stopAudioMixing(); Log.i(TAG, "stopAudioMixing >> ret=" + ret); - } - else if (v == effectStart) - { - /** Plays an audio effect file. + } else if (v == effectStart) { + /*Plays an audio effect file. * Returns * 0: Success. * < 0: Failure. */ int playRet = audioEffectManager.playEffect( - EFFECT_SOUND_ID, // The sound ID of the audio effect file to be played. - Constant.EFFECT_FILE_PATH, // The file path of the audio effect file. - -1, // The number of playback loops. -1 means an infinite loop. - 1, // pitch The pitch of the audio effect. The value ranges between 0.5 and 2. The default value is 1 (no change to the pitch). The lower the value, the lower the pitch. - 0.0, // Sets the spatial position of the effect. 0 means the effect shows ahead. - 100, // Sets the volume. The value ranges between 0 and 100. 100 is the original volume. - true // Sets whether to publish the audio effect. + EFFECT_SOUND_ID, // The sound ID of the audio effect file to be played. + Constant.EFFECT_FILE_PATH, // The file path of the audio effect file. + -1, // The number of playback loops. 
-1 means an infinite loop. + 1, // pitch The pitch of the audio effect. The value ranges between 0.5 and 2. The default value is 1 (no change to the pitch). The lower the value, the lower the pitch. + 0.0, // Sets the spatial position of the effect. 0 means the effect shows ahead. + 100, // Sets the volume. The value ranges between 0 and 100. 100 is the original volume. + true // Sets whether to publish the audio effect. ); - Log.i(TAG, "result playRet:"+ playRet); - } - else if(v == effectResume){ + Log.i(TAG, "result playRet:" + playRet); + } else if (v == effectResume) { int ret = engine.resumeEffect(EFFECT_SOUND_ID); Log.i(TAG, "resumeEffect >> ret=" + ret); - } - else if(v == effectPause){ + } else if (v == effectPause) { int ret = engine.pauseEffect(EFFECT_SOUND_ID); Log.i(TAG, "pauseEffect >> ret=" + ret); - } - else if(v == effectStop){ + } else if (v == effectStop) { int ret = engine.stopEffect(EFFECT_SOUND_ID); Log.i(TAG, "stopEffect >> ret=" + ret); } @@ -355,30 +328,29 @@ else if(v == effectStop){ /** * @param channelId Specify the channel name that you want to join. - * Users that input the same channel name join the same channel.*/ - private void joinChannel(String channelId) - { - /**In the demo, the default is to enter as the anchor.*/ + * Users that input the same channel name join the same channel. + */ + private void joinChannel(String channelId) { + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); engine.setAudioProfile( Constants.AudioProfile.getValue(Constants.AudioProfile.valueOf(audioProfile.getSelectedItem().toString())), Constants.AudioScenario.getValue(Constants.AudioScenario.valueOf(audioScenario.getSelectedItem().toString())) ); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours.
For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, ret -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ ChannelMediaOptions option = new ChannelMediaOptions(); option.autoSubscribeAudio = true; option.autoSubscribeVideo = true; int res = engine.joinChannel(ret, channelId, 0, option); - if (res != 0) - { + if (res != 0) { // Usually happens with invalid parameters // Error code description can be found at: // en: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html @@ -392,18 +364,18 @@ private void joinChannel(String channelId) }); } - /**IRtcEngineEventHandler is an abstract class providing default implementation. - * The SDK uses this class to report to the app on SDK runtime events.*/ - private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() - { + /** + * IRtcEngineEventHandler is an abstract class providing default implementation. + * The SDK uses this class to report to the app on SDK runtime events. 
+ */ + private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() { /** * Error code description can be found at: * en: https://api-ref.agora.io/en/video-sdk/android/4.x/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror * cn: https://docs.agora.io/cn/video-call-4.x/API%20Reference/java_ng/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror */ @Override - public void onError(int err) - { + public void onError(int err) { Log.w(TAG, String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err))); } @@ -411,8 +383,7 @@ public void onError(int err) * @param stats With this callback, the application retrieves the channel information, * such as the call duration and statistics.*/ @Override - public void onLeaveChannel(RtcStats stats) - { + public void onLeaveChannel(RtcStats stats) { super.onLeaveChannel(stats); Log.i(TAG, String.format("local user %d leaveChannel!", myUid)); showLongToast(String.format("local user %d leaveChannel!", myUid)); @@ -425,17 +396,14 @@ public void onLeaveChannel(RtcStats stats) * @param uid User ID * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/ @Override - public void onJoinChannelSuccess(String channel, int uid, int elapsed) - { + public void onJoinChannelSuccess(String channel, int uid, int elapsed) { Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); showLongToast(String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); myUid = uid; joined = true; - handler.post(new Runnable() - { + handler.post(new Runnable() { @Override - public void run() - { + public void run() { join.setEnabled(true); join.setText(getString(R.string.leave)); resetLayoutByJoin(); @@ -513,8 +481,7 @@ public void onRemoteAudioStateChanged(int uid, int state, int reason, int elapse * @param elapsed Time delay (ms) from the local user calling 
joinChannel/setClientRole * until this callback is triggered.*/ @Override - public void onUserJoined(int uid, int elapsed) - { + public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); @@ -532,8 +499,7 @@ public void onUserJoined(int uid, int elapsed) * USER_OFFLINE_BECOME_AUDIENCE(2): (Live broadcast only.) The client role switched from * the host to the audience.*/ @Override - public void onUserOffline(int uid, int reason) - { + public void onUserOffline(int uid, int reason) { Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason)); showLongToast(String.format("user %d offline! reason:%d", uid, reason)); runOnUIThread(() -> audioSeatManager.downSeat(uid)); @@ -552,29 +518,26 @@ public void onAudioMixingFinished() { @Override public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) { - if(seekBar.getId() == R.id.mixingPublishVolBar){ - /** + if (seekBar.getId() == R.id.mixingPublishVolBar) { + /* * Adjusts the volume of audio mixing for publishing (sending to other users). * @param volume: Audio mixing volume for publishing. The value ranges between 0 and 100 (default). */ engine.adjustAudioMixingPublishVolume(progress); - } - else if(seekBar.getId() == R.id.mixingPlayoutVolBar){ - /** + } else if (seekBar.getId() == R.id.mixingPlayoutVolBar) { + /* * Adjusts the volume of audio mixing for local playback. * @param volume: Audio mixing volume for local playback. The value ranges between 0 and 100 (default). */ engine.adjustAudioMixingPlayoutVolume(progress); - } - else if(seekBar.getId() == R.id.mixingVolBar){ - /** + } else if (seekBar.getId() == R.id.mixingVolBar) { + /* * Adjusts the volume of audio mixing. * Call this method when you are in a channel. * @param volume: Audio mixing volume. The value ranges between 0 and 100 (default). 
*/ engine.adjustAudioMixingVolume(progress); - } - else if(seekBar.getId() == R.id.effectVolBar){ + } else if (seekBar.getId() == R.id.effectVolBar) { engine.setEffectsVolume(progress); } } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PreCallTest.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PreCallTest.java index 6503c5b35..da8dfc969 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PreCallTest.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PreCallTest.java @@ -16,14 +16,13 @@ import androidx.annotation.Nullable; import java.util.Random; -import java.util.Timer; -import java.util.TimerTask; import io.agora.api.example.MainApplication; import io.agora.api.example.R; import io.agora.api.example.annotation.Example; import io.agora.api.example.common.BaseFragment; import io.agora.api.example.common.model.StatisticsInfo; +import io.agora.rtc2.ClientRoleOptions; import io.agora.rtc2.Constants; import io.agora.rtc2.EchoTestConfiguration; import io.agora.rtc2.IRtcEngineEventHandler; @@ -32,6 +31,9 @@ import io.agora.rtc2.internal.LastmileProbeConfig; import io.agora.rtc2.proxy.LocalAccessPointConfiguration; +/** + * The type Pre call test. 
+ */ @Example( index = 16, group = ADVANCED, @@ -44,12 +46,10 @@ public class PreCallTest extends BaseFragment implements View.OnClickListener { private RtcEngine engine; private int myUid; - private Button btn_lastmile, btn_echo; + private Button btn_lastmile, btn_echo_audio, btn_echo_video; private StatisticsInfo statisticsInfo; private TextView lastmileQuality, lastmileResult; private static final Integer MAX_COUNT_DOWN = 8; - private int num; - private Timer echoTimer; @Override public void onCreate(@Nullable Bundle savedInstanceState) { @@ -74,30 +74,30 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events.
*/ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -115,8 +115,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { // This api can only be used in the private media server scenario, otherwise some problems may occur. engine.setLocalAccessPoint(localAccessPointConfiguration); } - } - catch (Exception e) { + } catch (Exception e) { e.printStackTrace(); getActivity().onBackPressed(); } @@ -126,8 +125,10 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) { super.onViewCreated(view, savedInstanceState); statisticsInfo = new StatisticsInfo(); - btn_echo = view.findViewById(R.id.btn_echo); - btn_echo.setOnClickListener(this); + btn_echo_audio = view.findViewById(R.id.btn_echo); + btn_echo_video = view.findViewById(R.id.btn_echo_video); + btn_echo_audio.setOnClickListener(this); + btn_echo_video.setOnClickListener(this); btn_lastmile = view.findViewById(R.id.btn_lastmile); btn_lastmile.setOnClickListener(this); lastmileQuality = view.findViewById(R.id.lastmile_quality); @@ -142,12 +143,12 @@ public void onDestroy() { @Override public void onClick(View v) { - if (v.getId() == R.id.btn_lastmile) - { + if (v.getId() == R.id.btn_lastmile) { // Configure a LastmileProbeConfig instance. - LastmileProbeConfig config = new LastmileProbeConfig(){}; + LastmileProbeConfig config = new LastmileProbeConfig() { + }; // Probe the uplink network quality. 
- config.probeUplink = true; + config.probeUplink = true; // Probe the downlink network quality. config.probeDownlink = true; // The expected uplink bitrate (bps). The value range is [100000, 5000000]. @@ -158,38 +159,53 @@ public void onClick(View v) { engine.startLastmileProbeTest(config); btn_lastmile.setEnabled(false); btn_lastmile.setText("Testing ..."); - } - else if (v.getId() == R.id.btn_echo){ - num = 0; - engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); + } else if (v.getId() == R.id.btn_echo) { EchoTestConfiguration config = new EchoTestConfiguration(); config.enableVideo = false; config.enableAudio = true; config.intervalInSeconds = MAX_COUNT_DOWN; - config.channelId = (new Random().nextInt(10000) + 100000) + ""; + config.channelId = "AudioEchoTest" + (new Random().nextInt(1000) + 10000); engine.startEchoTest(config); - btn_echo.setEnabled(false); - btn_echo.setText("Recording on Microphone ..."); - echoTimer = new Timer(true); - echoTimer.schedule(new TimerTask(){ + btn_echo_audio.setEnabled(false); + btn_echo_audio.setText("Recording on Microphone ..."); + btn_echo_video.setEnabled(false); + btn_echo_audio.post(new Runnable() { + int countDownNum = 0; + + @Override public void run() { - num++; - if(num >= MAX_COUNT_DOWN * 2){ - handler.post(() -> { - btn_echo.setEnabled(true); - btn_echo.setText(R.string.start); - }); + countDownNum++; + if (countDownNum >= MAX_COUNT_DOWN * 2) { + btn_echo_video.setEnabled(true); + btn_echo_audio.setEnabled(true); + btn_echo_audio.setText(R.string.start); engine.stopEchoTest(); - echoTimer.cancel(); - } - else if(num >= MAX_COUNT_DOWN) { - handler.post(() -> btn_echo.setText("PLaying with " + (MAX_COUNT_DOWN * 2 - num) + "Seconds")); - } - else{ - handler.post(() -> btn_echo.setText("Recording with " + (MAX_COUNT_DOWN - num) + "Seconds")); + } else if (countDownNum >= MAX_COUNT_DOWN) { + btn_echo_audio.setText("Playing with " + (MAX_COUNT_DOWN * 2 - countDownNum) + " seconds"); + 
btn_echo_audio.postDelayed(this, 1000); + } else { + btn_echo_audio.setText("Recording with " + (MAX_COUNT_DOWN - countDownNum) + " seconds"); + btn_echo_audio.postDelayed(this, 1000); } } - }, 1000, 1000); + }); + } else if (v.getId() == R.id.btn_echo_video) { + EchoTestConfiguration config = new EchoTestConfiguration(); + config.enableVideo = true; + config.view = requireView().findViewById(R.id.surfaceView); + config.enableAudio = false; + config.intervalInSeconds = MAX_COUNT_DOWN; + config.channelId = "VideoEchoTest" + (new Random().nextInt(1000) + 10000); + engine.startEchoTest(config); + btn_echo_audio.setEnabled(false); + btn_echo_video.setEnabled(false); + btn_echo_video.setText(R.string.stop); + btn_echo_video.postDelayed(() -> { + btn_echo_video.setEnabled(true); + btn_echo_audio.setEnabled(true); + btn_echo_video.setText(R.string.start); + engine.stopEchoTest(); + }, MAX_COUNT_DOWN * 2 * 1000); } } @@ -208,6 +224,13 @@ public void onError(int err) { Log.w(TAG, String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err))); } + + @Override + public void onClientRoleChanged(int oldRole, int newRole, ClientRoleOptions newRoleOptions) { + super.onClientRoleChanged(oldRole, newRole, newRoleOptions); + showLongToast("onClientRoleChanged >> newRole = " + newRole); + } + /**Occurs when a user leaves the channel.
* @param stats With this callback, the application retrieves the channel information, * such as the call duration and statistics.*/ @@ -302,7 +325,7 @@ public void onUserOffline(int uid, int reason) { * @param quality */ @Override - public void onLastmileQuality(int quality){ + public void onLastmileQuality(int quality) { statisticsInfo.setLastMileQuality(quality); updateLastMileResult(); } @@ -328,10 +351,10 @@ public void onLastmileProbeResult(LastmileProbeResult lastmileProbeResult) { private void updateLastMileResult() { handler.post(() -> { - if(statisticsInfo.getLastMileQuality() != null){ + if (statisticsInfo.getLastMileQuality() != null) { lastmileQuality.setText("Quality: " + statisticsInfo.getLastMileQuality()); } - if(statisticsInfo.getLastMileResult() != null){ + if (statisticsInfo.getLastMileResult() != null) { lastmileResult.setText(statisticsInfo.getLastMileResult()); } }); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ProcessAudioRawData.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ProcessAudioRawData.java index 2c4c757bd..91d007ca6 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ProcessAudioRawData.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ProcessAudioRawData.java @@ -69,7 +69,7 @@ public class ProcessAudioRawData extends BaseFragment implements View.OnClickLis private AudioSeatManager audioSeatManager; - private void openAudioFile(){ + private void openAudioFile() { try { inputStream = this.getResources().getAssets().open(AUDIO_FILE); } catch (IOException e) { @@ -77,7 +77,7 @@ private void openAudioFile(){ } } - private void closeAudioFile(){ + private void closeAudioFile() { try { inputStream.close(); } catch (IOException e) { @@ -85,11 +85,11 @@ private void closeAudioFile(){ } } - private byte[] readBuffer(){ + private byte[] readBuffer() { int byteSize = SAMPLES * 
SAMPLE_NUM_OF_CHANNEL * 2; byte[] buffer = new byte[byteSize]; try { - if(inputStream.read(buffer) < 0){ + if (inputStream.read(buffer) < 0) { inputStream.reset(); return readBuffer(); } @@ -144,29 +144,29 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. 
*/ @@ -188,8 +188,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { engine.setRecordingAudioFrameParameters(SAMPLE_RATE, SAMPLE_NUM_OF_CHANNEL, Constants.RAW_AUDIO_FRAME_OP_MODE_READ_WRITE, SAMPLES); engine.setPlaybackAudioFrameParameters(SAMPLE_RATE, SAMPLE_NUM_OF_CHANNEL, Constants.RAW_AUDIO_FRAME_OP_MODE_READ_WRITE, SAMPLES); openAudioFile(); - } - catch (Exception e) { + } catch (Exception e) { e.printStackTrace(); getActivity().onBackPressed(); } @@ -199,7 +198,7 @@ private byte[] audioAggregate(byte[] origin, byte[] buffer) { byte[] output = new byte[buffer.length]; for (int i = 0; i < origin.length; i++) { output[i] = (byte) ((long) origin[i] / 2 + (long) buffer[i] / 2); - if(i == 2){ + if (i == 2) { Log.i(TAG, "origin :" + (int) origin[i] + " audio: " + (int) buffer[i]); } } @@ -209,8 +208,9 @@ private byte[] audioAggregate(byte[] origin, byte[] buffer) { @Override public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ + /*leaveChannel and Destroy the RtcEngine instance*/ if (engine != null) { + engine.registerAudioFrameObserver(null); engine.leaveChannel(); } handler.post(RtcEngine::destroy); @@ -234,14 +234,13 @@ public void onClick(View v) { AndPermission.with(this).runtime().permission( Permission.Group.STORAGE, Permission.Group.MICROPHONE - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. 
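The `audioAggregate` helper in the ProcessAudioRawData hunk above mixes the two PCM buffers by halving and adding each byte independently. For 16-bit PCM that discards carries between the low and high byte of a sample; a sample-accurate average is a common alternative. A minimal standalone sketch (class and method names are illustrative, not part of this patch):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

/**
 * Sketch: mixes two equal-length buffers of 16-bit little-endian PCM by
 * averaging whole samples rather than individual bytes.
 */
public final class PcmMix {
    private PcmMix() {
    }

    public static byte[] mix(byte[] origin, byte[] buffer) {
        ByteBuffer a = ByteBuffer.wrap(origin).order(ByteOrder.LITTLE_ENDIAN);
        ByteBuffer b = ByteBuffer.wrap(buffer).order(ByteOrder.LITTLE_ENDIAN);
        ByteBuffer out = ByteBuffer.allocate(origin.length).order(ByteOrder.LITTLE_ENDIAN);
        while (a.remaining() >= 2 && b.remaining() >= 2) {
            // Average in int space to avoid short overflow, then narrow back.
            out.putShort((short) ((a.getShort() + b.getShort()) / 2));
        }
        return out.array();
    }
}
```

Averaging per sample keeps values such as 0x0101 intact, whereas the byte-wise form loses the low-order bits whenever a sample's bytes straddle a carry.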
@@ -270,11 +269,11 @@ public void onClick(View v) { * Users that input the same channel name join the same channel. */ private void joinChannel(String channelId) { - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); engine.setDefaultAudioRoutetoSpeakerphone(true); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see @@ -303,8 +302,11 @@ private void joinChannel(String channelId) { private final IAudioFrameObserver iAudioFrameObserver = new IAudioFrameObserver() { @Override - public boolean onRecordAudioFrame(String channel, int audioFrameType, int samples, int bytesPerSample, int channels, int samplesPerSec, ByteBuffer byteBuffer, long renderTimeMs, int bufferLength) { - if(isWriteBackAudio){ + public boolean onRecordAudioFrame(String channel, int audioFrameType, + int samples, int bytesPerSample, + int channels, int samplesPerSec, + ByteBuffer byteBuffer, long renderTimeMs, int bufferLength) { + if (isWriteBackAudio) { int length = byteBuffer.remaining(); // byteBuffer.flip(); byte[] buffer = readBuffer(); @@ -319,22 +321,34 @@ public boolean onRecordAudioFrame(String channel, int audioFrameType, int sample @Override - public boolean onPlaybackAudioFrame(String channel, int audioFrameType, int samples, int bytesPerSample, int channels, int samplesPerSec, ByteBuffer byteBuffer, long renderTimeMs, int bufferLength) { + public boolean onPlaybackAudioFrame(String channel, int audioFrameType, + int samples, int bytesPerSample, + int channels, int samplesPerSec, + ByteBuffer 
byteBuffer, long renderTimeMs, + int bufferLength) { return false; } @Override - public boolean onMixedAudioFrame(String channel, int audioFrameType, int samples, int bytesPerSample, int channels, int samplesPerSec, ByteBuffer byteBuffer, long renderTimeMs, int bufferLength) { + public boolean onMixedAudioFrame(String channel, int audioFrameType, + int samples, int bytesPerSample, int channels, + int samplesPerSec, ByteBuffer byteBuffer, + long renderTimeMs, int bufferLength) { return false; } @Override - public boolean onEarMonitoringAudioFrame(int type, int samplesPerChannel, int bytesPerSample, int channels, int samplesPerSec, ByteBuffer buffer, long renderTimeMs, int avsync_type) { + public boolean onEarMonitoringAudioFrame(int type, int samplesPerChannel, int bytesPerSample, + int channels, int samplesPerSec, + ByteBuffer buffer, long renderTimeMs, int avsyncType) { return false; } @Override - public boolean onPlaybackAudioFrameBeforeMixing(String channel, int uid, int audioFrameType, int samples, int bytesPerSample, int channels, int samplesPerSec, ByteBuffer byteBuffer, long renderTimeMs, int bufferLength) { + public boolean onPlaybackAudioFrameBeforeMixing(String channel, int uid, int audioFrameType, + int samples, int bytesPerSample, int channels, + int samplesPerSec, ByteBuffer byteBuffer, + long renderTimeMs, int bufferLength) { return false; } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ProcessRawData.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ProcessRawData.java index e2fa732e7..e516a89cd 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ProcessRawData.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ProcessRawData.java @@ -53,6 +53,9 @@ import io.agora.rtc2.video.VideoCanvas; import io.agora.rtc2.video.VideoEncoderConfiguration; +/** + * The type Process raw data. 
+ */ @Example( index = 11, group = ADVANCED, @@ -82,30 +85,30 @@ public void onCreate(@Nullable Bundle savedInstanceState) { } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -123,8 +126,7 @@ public void onCreate(@Nullable Bundle savedInstanceState) { // This api can only be used in the private media server scenario, otherwise some problems may occur. 
engine.setLocalAccessPoint(localAccessPointConfiguration); } - } - catch (Exception e) { + } catch (Exception e) { e.printStackTrace(); getActivity().onBackPressed(); } @@ -156,8 +158,9 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { @Override public void onDestroy() { - /**leaveChannel and Destroy the RtcEngine instance*/ + /*leaveChannel and Destroy the RtcEngine instance*/ if (engine != null) { + engine.registerVideoFrameObserver(null); engine.leaveChannel(); engine.stopPreview(); } @@ -183,14 +186,14 @@ public void onClick(View v) { Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + engine.registerVideoFrameObserver(null); + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. 
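Further down in this file's diff, `onCaptureVideoFrame` sizes its NV21 scratch buffer as `(int) ((width * height * 3 + 1) / 2.0f)`. The same minimum size falls out of plain integer math; a standalone sketch (the helper name is hypothetical, not part of the patch):

```java
/**
 * Sketch: minimum byte size of an NV21 frame, i.e. a full-resolution Y plane
 * (width * height bytes) plus an interleaved VU plane at half that size,
 * rounded up so odd dimensions still fit.
 */
public final class Nv21Size {
    private Nv21Size() {
    }

    public static int minSize(int width, int height) {
        return (width * height * 3 + 1) / 2;
    }
}
```

For a 640x480 frame this is 460800 bytes, which matches the float-based expression used in the patch.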
@@ -211,9 +214,7 @@ public void onClick(View v) { engine.stopPreview(); join.setText(getString(R.string.join)); } - } - else if(v.getId() == R.id.btn_snapshot) - { + } else if (v.getId() == R.id.btn_snapshot) { isSnapshot = true; } } @@ -232,16 +233,16 @@ private void joinChannel(String channelId) { // Setup local video to render your local camera preview engine.setupLocalVideo(new VideoCanvas(surfaceView, RENDER_MODE_HIDDEN, 0)); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Setup video encoding configs engine.setVideoEncoderConfiguration(new VideoEncoderConfiguration( - ((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), - VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), + ((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), + VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), STANDARD_BITRATE, - VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) + VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) )); - /**Set up to play remote sound with receiver*/ + /*Set up to play remote sound with receiver*/ engine.setDefaultAudioRoutetoSpeakerphone(true); engine.registerVideoFrameObserver(iVideoFrameObserver); @@ -249,13 +250,13 @@ private void joinChannel(String channelId) { engine.startPreview(); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. 
* A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, token -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ ChannelMediaOptions option = new ChannelMediaOptions(); @@ -279,7 +280,7 @@ private void joinChannel(String channelId) { private final IVideoFrameObserver iVideoFrameObserver = new IVideoFrameObserver() { @Override public boolean onCaptureVideoFrame(int sourceType, VideoFrame videoFrame) { - Log.i(TAG, "OnEncodedVideoImageReceived"+Thread.currentThread().getName()); + Log.i(TAG, "onCaptureVideoFrame " + Thread.currentThread().getName()); long startTime = System.currentTimeMillis(); VideoFrame.Buffer buffer = videoFrame.getBuffer(); @@ -304,7 +305,7 @@ public boolean onCaptureVideoFrame(int sourceType, VideoFrame videoFrame) { // device: HUAWEI DUB-AL00 // consume time: 11ms, 8ms, 10ms, 10ms, 9ms, 10ms int nv21MinSize = (int) ((width * height * 3 + 1) / 2.0f); - if(videoNV21Buffer == null || videoNV21Buffer.capacity() < nv21MinSize){ + if (videoNV21Buffer == null || videoNV21Buffer.capacity() < nv21MinSize) { videoNV21Buffer = ByteBuffer.allocateDirect(nv21MinSize); videoNV21 = new byte[nv21MinSize]; } @@ -321,10 +322,10 @@ public boolean onCaptureVideoFrame(int sourceType, VideoFrame videoFrame) { // Release the buffer!
i420Buffer.release(); - if(isSnapshot){ + if (isSnapshot) { isSnapshot = false; - Bitmap bitmap = YUVUtils.NV21ToBitmap(getContext(), + Bitmap bitmap = YUVUtils.nv21ToBitmap(getContext(), nv21, width, height); @@ -439,15 +440,13 @@ public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; } - handler.post(() -> - { - - /**Display remote video stream*/ + handler.post(() -> { + /*Display remote video stream*/ // Create render view by RtcEngine SurfaceView surfaceView = new SurfaceView(context); surfaceView.setZOrderMediaOverlay(true); @@ -479,7 +478,7 @@ public void onUserOffline(int uid, int reason) { handler.post(new Runnable() { @Override public void run() { - /**Clear render view + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); @@ -488,16 +487,23 @@ public void run() { } }; - public void saveBitmap2Gallery(Bitmap bm){ + /** + * Save bitmap 2 gallery. 
+ * + * @param bm the bm + */ + public void saveBitmap2Gallery(Bitmap bm) { long currentTime = System.currentTimeMillis(); // name the file - String imageFileName = "IMG_AGORA_"+ currentTime + ".jpg"; + String imageFileName = "IMG_AGORA_" + currentTime + ".jpg"; String imageFilePath; - if(Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) { imageFilePath = Environment.DIRECTORY_PICTURES + File.separator + "Agora" + File.separator; - else imageFilePath = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES).getAbsolutePath() - + File.separator + "Agora"+ File.separator; + } else { + imageFilePath = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES).getAbsolutePath() + + File.separator + "Agora" + File.separator; + } // write to file @@ -505,25 +511,25 @@ public void saveBitmap2Gallery(Bitmap bm){ ContentResolver resolver = requireContext().getContentResolver(); ContentValues newScreenshot = new ContentValues(); Uri insert; - newScreenshot.put(MediaStore.Images.ImageColumns.DATE_ADDED,currentTime); + newScreenshot.put(MediaStore.Images.ImageColumns.DATE_ADDED, currentTime); newScreenshot.put(MediaStore.Images.ImageColumns.DISPLAY_NAME, imageFileName); newScreenshot.put(MediaStore.Images.ImageColumns.MIME_TYPE, "image/jpg"); newScreenshot.put(MediaStore.Images.ImageColumns.WIDTH, bm.getWidth()); newScreenshot.put(MediaStore.Images.ImageColumns.HEIGHT, bm.getHeight()); try { if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) { - newScreenshot.put(MediaStore.Images.ImageColumns.RELATIVE_PATH,imageFilePath); - }else{ + newScreenshot.put(MediaStore.Images.ImageColumns.RELATIVE_PATH, imageFilePath); + } else { // make sure the path is existed File imageFileDir = new File(imageFilePath); - if(!imageFileDir.exists()){ + if (!imageFileDir.exists()) { boolean mkdir = imageFileDir.mkdirs(); - if(!mkdir) { + if (!mkdir) { showLongToast("save failed, error: cannot create folder. 
Make sure app has the permission."); return; } } - newScreenshot.put(MediaStore.Images.ImageColumns.DATA, imageFilePath+imageFileName); + newScreenshot.put(MediaStore.Images.ImageColumns.DATA, imageFilePath + imageFileName); newScreenshot.put(MediaStore.Images.ImageColumns.TITLE, imageFileName); } @@ -542,7 +548,7 @@ public void saveBitmap2Gallery(Bitmap bm){ showLongToast("save success, you can view it in gallery"); } catch (Exception e) { - showLongToast("save failed, error: "+ e.getMessage()); + showLongToast("save failed, error: " + e.getMessage()); e.printStackTrace(); } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PushExternalVideo.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PushExternalVideo.java index 57783e1b4..c9d82a71a 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PushExternalVideo.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PushExternalVideo.java @@ -64,8 +64,9 @@ //) /** - * @deprecated The impletation of custom has been moved to {@link PushExternalVideoYUV}. - * You can refer to {@link PushExternalVideoYUV} example. + * The type Push external video. + * + * @deprecated The implementation of custom video capture has been moved to {@link PushExternalVideoYUV}. You can refer to the {@link PushExternalVideoYUV} example. */ public class PushExternalVideo extends BaseFragment implements View.OnClickListener, TextureView.SurfaceTextureListener, SurfaceTexture.OnFrameAvailableListener { @@ -137,30 +138,30 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine.
+ /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -188,9 +189,9 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { @Override public void onDestroy() { - /**leaveChannel and Destroy the RtcEngine instance*/ + /*leaveChannel and Destroy the RtcEngine instance*/ if (engine != null) { - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. 
@@ -209,8 +210,7 @@ public void onDestroy() { * triggers the removeInjectStreamUrl method.*/ engine.leaveChannel(); engine.stopPreview(); - if (textureBufferHelper != null) - { + if (textureBufferHelper != null) { textureBufferHelper.dispose(); textureBufferHelper = null; } @@ -237,8 +237,7 @@ public void onClick(View v) { Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); @@ -264,21 +263,21 @@ private void joinChannel(String channelId) { // Add to the local container fl_local.addView(textureView, new FrameLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.MATCH_PARENT)); - /**Set up to play remote sound with receiver*/ + /*Set up to play remote sound with receiver*/ engine.setDefaultAudioRoutetoSpeakerphone(true); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enables the video module. 
engine.enableVideo(); // Setup video encoding configs engine.setVideoEncoderConfiguration(new VideoEncoderConfiguration( - ((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), - VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), + ((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), + VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), STANDARD_BITRATE, - VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) + VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) )); - /**Configures the external video source. + /*Configures the external video source. * @param enable Sets whether or not to use the external video source: * true: Use the external video source. * false: Do not use the external video source. @@ -290,13 +289,13 @@ private void joinChannel(String channelId) { * ENCODED_VIDEO_FRAME: Use the ENCODED_VIDEO_FRAME*/ engine.setExternalVideoSource(true, true, Constants.ExternalVideoSourceType.VIDEO_FRAME); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. 
For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, token -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ ChannelMediaOptions option = new ChannelMediaOptions(); @@ -326,24 +325,19 @@ public void onFrameAvailable(SurfaceTexture surfaceTexture) { if (!mEglCore.isCurrent(mDrawSurface)) { mEglCore.makeCurrent(mDrawSurface); } - /** Use surfaceTexture's timestamp, in nanosecond */ - long timestampNs = -1; try { surfaceTexture.updateTexImage(); surfaceTexture.getTransformMatrix(mTransform); - timestampNs = surfaceTexture.getTimestamp(); - } - catch (Exception e) { + } catch (Exception e) { e.printStackTrace(); } - /**The rectangle ratio of frames and the screen surface may be different, so cropping may + /*The rectangle ratio of frames and the screen surface may be different, so cropping may * happen when display frames to the screen. * The display transformation matrix does not change for the same camera when the screen * orientation remains the same.*/ if (!mMVPMatrixInit) { - /***/ - /**For simplicity, we only consider the activity as portrait mode. In this case, the captured + /*For simplicity, we only consider the activity as portrait mode. 
In this case, the captured * images should be rotated 90 degrees (left or right). Thus the frame width and height * should be swapped.*/ float frameRatio = DEFAULT_CAPTURE_HEIGHT / (float) DEFAULT_CAPTURE_WIDTH; @@ -368,15 +362,14 @@ public void onFrameAvailable(SurfaceTexture surfaceTexture) { if (joined) { VideoFrame.Buffer buffer = textureBufferHelper.invoke(new Callable() { @Override - public VideoFrame.Buffer call() throws Exception - { - return textureBufferHelper.wrapTextureBuffer( DEFAULT_CAPTURE_HEIGHT, + public VideoFrame.Buffer call() throws Exception { + return textureBufferHelper.wrapTextureBuffer(DEFAULT_CAPTURE_HEIGHT, DEFAULT_CAPTURE_WIDTH, VideoFrame.TextureBuffer.Type.OES, mPreviewTexture, RendererCommon.convertMatrixToAndroidGraphicsMatrix(mTransform)); } }); VideoFrame frame = new VideoFrame(buffer, 0, 0); - /**Pushes the video frame using the AgoraVideoFrame class and passes the video frame to the Agora SDK. + /*Pushes the video frame using the AgoraVideoFrame class and passes the video frame to the Agora SDK. * Call the setExternalVideoSource method and set pushMode as true before calling this * method. Otherwise, a failure returns after calling this method. * @param frame AgoraVideoFrame @@ -396,13 +389,13 @@ public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int hei mTextureDestroyed = false; mSurfaceWidth = width; mSurfaceHeight = height; - /** handler associate to the GL thread which creates the texture. + /* handler associated with the GL thread which creates the texture. * in some condition SDK need to convert from texture format to YUV format, in this case, * SDK will use this handler to switch into the GL thread to complete the conversion.
* */ mHandler = new Handler(Looper.myLooper()); mEglCore = new EglCore(); - if(!glPrepared){ + if (!glPrepared) { // setup egl context EglBase.Context eglContext = new EglBase14.Context(mEglCore.getEGLContext()); glPrepared = prepareGl(eglContext, width, height); @@ -420,19 +413,18 @@ public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int hei } try { mCamera = Camera.open(mFacing); - /**It is assumed to capture images of resolution 640x480. During development, it should + /*It is assumed to capture images of resolution 640x480. During development, it should * be the most suitable supported resolution that best fits the scenario.*/ Camera.Parameters parameters = mCamera.getParameters(); parameters.setPreviewSize(DEFAULT_CAPTURE_WIDTH, DEFAULT_CAPTURE_HEIGHT); mCamera.setParameters(parameters); mCamera.setPreviewTexture(mPreviewSurfaceTexture); - /**The display orientation is 90 for both front and back facing cameras using a surface + /*The display orientation is 90 for both front and back facing cameras using a surface * texture for the preview when the screen is in portrait mode.*/ mCamera.setDisplayOrientation(90); mCamera.startPreview(); mPreviewing = true; - } - catch (IOException e) { + } catch (IOException e) { e.printStackTrace(); } } @@ -519,14 +511,13 @@ public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; } - handler.post(() -> - { - /**Display remote video stream*/ + handler.post(() -> { + /*Display remote video stream*/ // Create render view by RtcEngine SurfaceView surfaceView = new SurfaceView(context); surfaceView.setZOrderMediaOverlay(true); @@ -558,7 +549,7 @@ public void onUserOffline(int uid, int reason) { handler.post(new Runnable() { @Override public void run() { - 
/**Clear render view + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PushExternalVideoYUV.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PushExternalVideoYUV.java index f9af4c848..b4d13a9f4 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PushExternalVideoYUV.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/PushExternalVideoYUV.java @@ -55,6 +55,9 @@ import io.agora.rtc2.video.VideoCanvas; import io.agora.rtc2.video.VideoEncoderConfiguration; +/** + * This example demonstrates pushing external YUV video frames to the SDK. + */ @Example( index = 7, group = ADVANCED, @@ -107,22 +110,22 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation.
* The SDK uses this class to report to the app on SDK runtime events. */ @@ -130,7 +133,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = (RtcEngineEx) RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -162,9 +165,9 @@ public void onDestroy() { videoFileReader.stop(); } - /**leaveChannel and Destroy the RtcEngine instance*/ + /*leaveChannel and Destroy the RtcEngine instance*/ if (engine != null) { - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. @@ -221,8 +224,7 @@ public void onClick(View v) { Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); @@ -249,10 +251,10 @@ private void joinChannel(String channelId) { return; } - /**Set up to play remote sound with receiver*/ + /*Set up to play remote sound with receiver*/ engine.setDefaultAudioRoutetoSpeakerphone(true); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enables the video module. 
engine.enableVideo(); @@ -263,7 +265,7 @@ private void joinChannel(String channelId) { STANDARD_BITRATE, VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) )); - /**Configures the external video source. + /*Configures the external video source. * @param enable Sets whether or not to use the external video source: * true: Use the external video source. * false: Do not use the external video source. @@ -286,13 +288,13 @@ private void joinChannel(String channelId) { ViewGroup.LayoutParams.MATCH_PARENT)); engine.startPreview(Constants.VideoSourceType.VIDEO_SOURCE_CUSTOM); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, accessToken -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ ChannelMediaOptions option = new ChannelMediaOptions(); @@ -318,11 +320,11 @@ private void joinChannel(String channelId) { /** * Push video frame by i420. 
* - * @param yuv i420 data - * @param width width + * @param yuv i420 data + * @param width width * @param height height */ - private void pushVideoFrameByI420(byte[] yuv, int width, int height){ + private void pushVideoFrameByI420(byte[] yuv, int width, int height) { JavaI420Buffer i420Buffer = JavaI420Buffer.allocate(width, height); i420Buffer.getDataY().put(yuv, 0, i420Buffer.getDataY().limit()); i420Buffer.getDataU().put(yuv, i420Buffer.getDataY().limit(), i420Buffer.getDataU().limit()); @@ -353,11 +355,11 @@ private void pushVideoFrameByI420(byte[] yuv, int width, int height){ /** * Push video frame by nv21. * - * @param nv21 nv21 - * @param width width + * @param nv21 nv21 + * @param width width * @param height height */ - private void pushVideoFrameByNV21(byte[] nv21, int width, int height){ + private void pushVideoFrameByNV21(byte[] nv21, int width, int height) { VideoFrame.Buffer frameBuffer = new NV21Buffer(nv21, width, height, null); @@ -384,11 +386,11 @@ private void pushVideoFrameByNV21(byte[] nv21, int width, int height){ /** * Push video frame by nv12. * - * @param nv12 nv12 buffer. - * @param width width. + * @param nv12 nv12 buffer. + * @param width width. * @param height height. */ - private void pushVideoFrameByNV12(ByteBuffer nv12, int width, int height){ + private void pushVideoFrameByNV12(ByteBuffer nv12, int width, int height) { VideoFrame.Buffer frameBuffer = new NV12Buffer(width, height, width, height, nv12, null); /* @@ -414,13 +416,13 @@ private void pushVideoFrameByNV12(ByteBuffer nv12, int width, int height){ /** * Push video frame by texture id. * - * @param textureId texture id. + * @param textureId texture id. * @param textureType texture type. rgb or oes. - * @param width width. - * @param height height. + * @param width width. + * @param height height. 
*/ @GLThread - private void pushVideoFrameByTexture(int textureId, VideoFrame.TextureBuffer.Type textureType, int width, int height){ + private void pushVideoFrameByTexture(int textureId, VideoFrame.TextureBuffer.Type textureType, int width, int height) { VideoFrame.Buffer frameBuffer = new TextureBuffer( EglBaseProvider.getCurrentEglContext(), width, @@ -535,14 +537,13 @@ public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; } - handler.post(() -> - { - /**Display remote video stream*/ + handler.post(() -> { + /*Display remote video stream*/ // Create render view by RtcEngine SurfaceView surfaceView = new SurfaceView(context); surfaceView.setZOrderMediaOverlay(true); @@ -574,7 +575,7 @@ public void onUserOffline(int uid, int reason) { handler.post(new Runnable() { @Override public void run() { - /**Clear render view + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ fl_remote.removeAllViews(); @@ -588,8 +589,8 @@ public void run() { * Yuv 2 texture id. * Run on gl thread. * - * @param yuv yuv - * @param width width + * @param yuv yuv + * @param width width * @param height heigh * @return rgba texture id */ @@ -604,8 +605,8 @@ private int yuv2texture(byte[] yuv, int width, int height) { /** * Transform yuv to nv12 * - * @param yuv yuv - * @param width width + * @param yuv yuv + * @param width width * @param height height * @return nv12 */ @@ -638,8 +639,8 @@ private static ByteBuffer yuv2nv12(byte[] yuv, int width, int height) { /** * Transform yuv to nv21. 
* - * @param yuv yuv - * @param width width + * @param yuv yuv + * @param width width * @param height height * @return nv21 */ diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/RTMPStreaming.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/RTMPStreaming.java index 43455fb3f..30cf799c7 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/RTMPStreaming.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/RTMPStreaming.java @@ -25,6 +25,8 @@ import com.yanzhenjie.permission.AndPermission; import com.yanzhenjie.permission.runtime.Permission; +import java.util.ArrayList; + import io.agora.api.example.MainApplication; import io.agora.api.example.R; import io.agora.api.example.annotation.Example; @@ -55,8 +57,7 @@ actionId = R.id.action_mainFragment_to_RTCToRTMP, tipsId = R.string.rtmpstreaming ) -public class RTMPStreaming extends BaseFragment implements View.OnClickListener -{ +public class RTMPStreaming extends BaseFragment implements View.OnClickListener { private static final String TAG = RTMPStreaming.class.getSimpleName(); private LinearLayout llTransCode; @@ -68,27 +69,24 @@ public class RTMPStreaming extends BaseFragment implements View.OnClickListener private int myUid; private boolean joined = false, publishing = false; private VideoEncoderConfiguration.VideoDimensions dimensions = VD_640x360; - private LiveTranscoding transcoding = new LiveTranscoding(); - private static final Integer MAX_RETRY_TIMES = 3; + private final LiveTranscoding transcoding = new LiveTranscoding(); + private static final int MAX_RETRY_TIMES = 3; private int retried = 0; private boolean unpublishing = false; /** * Maximum number of users participating in transcoding (even number) */ private final int MAXUserCount = 2; - private LiveTranscoding.TranscodingUser localTranscodingUser; @Nullable @Override - public View onCreateView(@NonNull 
LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) - { + public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) { View view = inflater.inflate(R.layout.fragment_rtmp_streaming, container, false); return view; } @Override - public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) - { + public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) { super.onViewCreated(view, savedInstanceState); llTransCode = view.findViewById(R.id.ll_TransCode); transCodeSwitch = view.findViewById(R.id.transCode_Switch); @@ -103,42 +101,39 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat } @Override - public void onActivityCreated(@Nullable Bundle savedInstanceState) - { + public void onActivityCreated(@Nullable Bundle savedInstanceState) { super.onActivityCreated(savedInstanceState); // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } - try - { + try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. 
A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -156,44 +151,37 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) // This api can only be used in the private media server scenario, otherwise some problems may occur. 
engine.setLocalAccessPoint(localAccessPointConfiguration); } - } - catch (Exception e) - { + } catch (Exception e) { e.printStackTrace(); getActivity().onBackPressed(); } } @Override - public void onDestroy() - { + public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ - if(engine != null) - { + /*leaveChannel and Destroy the RtcEngine instance*/ + if (engine != null) { engine.leaveChannel(); } - if(retryTask != null){ + if (retryTask != null) { retryTask.cancel(true); + retryTask = null; } handler.post(RtcEngine::destroy); engine = null; } @Override - public void onClick(View v) - { + public void onClick(View v) { - if (v.getId() == R.id.btn_join) - { - if(!joined) - { + if (v.getId() == R.id.btn_join) { + if (!joined) { CommonUtil.hideInputBoard(getActivity(), et_channel); // call when join button hit String channelId = et_channel.getText().toString(); // Check permission - if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) - { + if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) { joinChannel(channelId); return; } @@ -202,41 +190,38 @@ public void onClick(View v) Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); - } - else - { + } else { + if (publishing) { + stopPublish(); + publishing = false; + } engine.leaveChannel(); transCodeSwitch.setEnabled(true); joined = false; join.setText(getString(R.string.join)); - publishing = false; publish.setEnabled(false); publish.setText(getString(R.string.publish)); + transcoding.setUsers(new ArrayList<>()); } - } - else if (v.getId() == R.id.btn_publish) - { - /**Ensure that the user joins a channel before calling this method.*/ + } else if (v.getId() == R.id.btn_publish) { + /*Ensure that the 
user joins a channel before calling this method.*/ retried = 0; if (joined && !publishing) { startPublish(); - } else if (joined && publishing) { + } else if (joined) { stopPublish(); } } } - private void joinChannel(String channelId) - { + private void joinChannel(String channelId) { // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } @@ -247,27 +232,27 @@ private void joinChannel(String channelId) // Setup local video to render your local camera preview engine.setupLocalVideo(new VideoCanvas(surfaceView, RENDER_MODE_HIDDEN, 0)); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enable video module engine.enableVideo(); // Setup video encoding configs engine.setVideoEncoderConfiguration(new VideoEncoderConfiguration( - ((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), - VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), + ((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), + VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), STANDARD_BITRATE, - VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) + VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) )); - /**Set up to play remote sound with receiver*/ + /*Set up to play remote sound with receiver*/ engine.setDefaultAudioRoutetoSpeakerphone(true); - /**Please configure accessToken in the string_config file. 
+ /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, accessToken -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ ChannelMediaOptions option = new ChannelMediaOptions(); @@ -289,27 +274,28 @@ private void joinChannel(String channelId) private void startPublish() { if (transCodeSwitch.isChecked()) { - /**LiveTranscoding: A class for managing user-specific CDN live audio/video transcoding settings. + /*LiveTranscoding: A class for managing user-specific CDN live audio/video transcoding settings. * See */ transcoding.width = dimensions.height; transcoding.height = dimensions.width; - /**The transcodingUser class which defines the video properties of the user displaying the + /*The transcodingUser class which defines the video properties of the user displaying the * video in the CDN live. Agora supports a maximum of 17 transcoding users in a CDN live streaming channel. * See */ - localTranscodingUser = new LiveTranscoding.TranscodingUser(); + LiveTranscoding.TranscodingUser localTranscodingUser = new LiveTranscoding.TranscodingUser(); localTranscodingUser.x = 0; localTranscodingUser.y = 0; localTranscodingUser.width = transcoding.width; localTranscodingUser.height = transcoding.height / MAXUserCount; localTranscodingUser.uid = myUid; - /**Adds a user displaying the video in CDN live. + localTranscodingUser.zOrder = 1; + /*Adds a user displaying the video in CDN live. * @return * 0: Success. 
* <0: Failure.*/ - int ret = transcoding.addUser(localTranscodingUser); + transcoding.addUser(localTranscodingUser); } - if(startRtmpStreaming() == 0){ + if (startRtmpStreaming() == 0) { retryTask = new AsyncTask() { @Override protected Object doInBackground(Object[] objects) { @@ -328,25 +314,24 @@ protected Object doInBackground(Object[] objects) { }; retryTask.execute(); } - /**Prevent repeated entry*/ + /*Prevent repeated entry*/ publish.setEnabled(false); - /**Prevent duplicate clicks*/ + /*Prevent duplicate clicks*/ transCodeSwitch.setEnabled(false); } - private int startRtmpStreaming(){ + private int startRtmpStreaming() { int code; - if(transCodeSwitch.isChecked()){ + if (transCodeSwitch.isChecked()) { code = engine.startRtmpStreamWithTranscoding(et_url.getText().toString(), transcoding); - } - else { + } else { code = engine.startRtmpStreamWithoutTranscoding(et_url.getText().toString()); } return code; } private void stopPublish() { - /**Removes an RTMP stream from the CDN. + /*Removes an RTMP stream from the CDN. * This method removes the RTMP URL address (added by addPublishStreamUrl) from a CDN live * stream. The SDK reports the result of this method call in the onRtmpStreamingStateChanged callback. * @param url The RTMP URL address to be removed. The maximum length of this parameter is @@ -363,23 +348,22 @@ private void stopPublish() { * This method removes only one stream RTMP URL address each time it is called.*/ unpublishing = true; retryTask.cancel(true); - int ret = engine.stopRtmpStream(et_url.getText().toString()); + engine.stopRtmpStream(et_url.getText().toString()); + transcoding.removeUser(myUid); } /** * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. 
*/ - private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() - { + private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() { /** * Error code description can be found at: * en: https://api-ref.agora.io/en/video-sdk/android/4.x/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror * cn: https://docs.agora.io/cn/video-call-4.x/API%20Reference/java_ng/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror */ @Override - public void onError(int err) - { + public void onError(int err) { Log.w(TAG, String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err))); } @@ -387,8 +371,7 @@ public void onError(int err) * @param stats With this callback, the application retrieves the channel information, * such as the call duration and statistics.*/ @Override - public void onLeaveChannel(RtcStats stats) - { + public void onLeaveChannel(RtcStats stats) { super.onLeaveChannel(stats); Log.i(TAG, String.format("local user %d leaveChannel!", myUid)); showLongToast(String.format("local user %d leaveChannel!", myUid)); @@ -401,17 +384,14 @@ public void onLeaveChannel(RtcStats stats) * @param uid User ID * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/ @Override - public void onJoinChannelSuccess(String channel, int uid, int elapsed) - { + public void onJoinChannelSuccess(String channel, int uid, int elapsed) { Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); showLongToast(String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); myUid = uid; joined = true; - handler.post(new Runnable() - { + handler.post(new Runnable() { @Override - public void run() - { + public void run() { join.setEnabled(true); join.setText(getString(R.string.leave)); publish.setEnabled(true); @@ -496,14 +476,12 @@ public void onRemoteAudioStateChanged(int uid, int state, int 
reason, int elapse * @param elapsed Time elapsed (ms) from the local user calling the joinChannel method until * the SDK triggers this callback.*/ @Override - public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) - { + public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) { super.onRemoteVideoStateChanged(uid, state, reason, elapsed); Log.i(TAG, "onRemoteVideoStateChanged->" + uid + ", state->" + state + ", reason->" + reason); } - /**Since v2.4.1 * Occurs when the state of the RTMP streaming changes. * This callback indicates the state of the RTMP streaming. When exceptions occur, you can @@ -554,13 +532,12 @@ public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapse public void onRtmpStreamingStateChanged(String url, int state, int errCode) { super.onRtmpStreamingStateChanged(url, state, errCode); Log.i(TAG, "onRtmpStreamingStateChanged->" + url + ", state->" + state + ", errCode->" + errCode); - if(retryTask == null){ + if (retryTask == null) { return; } - if(state == Constants.RTMP_STREAM_PUBLISH_STATE_RUNNING) - { - /**After confirming the successful push, make changes to the UI.*/ - if(errCode == Constants.RTMP_STREAM_PUBLISH_ERROR_OK){ + if (state == Constants.RTMP_STREAM_PUBLISH_STATE_RUNNING) { + /*After confirming the successful push, make changes to the UI.*/ + if (errCode == Constants.RTMP_STREAM_PUBLISH_REASON_OK) { publishing = true; retried = 0; retryTask.cancel(true); @@ -571,41 +548,38 @@ public void onRtmpStreamingStateChanged(String url, int state, int errCode) { } } else if (state == Constants.RTMP_STREAM_PUBLISH_STATE_FAILURE) { engine.stopRtmpStream(et_url.getText().toString()); - if((errCode == Constants.RTMP_STREAM_PUBLISH_ERROR_CONNECTION_TIMEOUT - || errCode == Constants.RTMP_STREAM_PUBLISH_ERROR_INTERNAL_SERVER_ERROR - || errCode == Constants.RTMP_STREAM_PUBLISH_ERROR_RTMP_SERVER_ERROR - || errCode == Constants.RTMP_STREAM_PUBLISH_ERROR_STREAM_NOT_FOUND - || 
errCode == Constants.RTMP_STREAM_PUBLISH_ERROR_NET_DOWN)) - { - /**need republishing.*/ + if (errCode == Constants.RTMP_STREAM_PUBLISH_REASON_CONNECTION_TIMEOUT + || errCode == Constants.RTMP_STREAM_PUBLISH_REASON_INTERNAL_SERVER_ERROR + || errCode == Constants.RTMP_STREAM_PUBLISH_REASON_RTMP_SERVER_ERROR + || errCode == Constants.RTMP_STREAM_PUBLISH_REASON_STREAM_NOT_FOUND + || errCode == Constants.RTMP_STREAM_PUBLISH_REASON_NET_DOWN) { + /*need republishing.*/ Log.w(TAG, "RTMP publish failure ->" + url + ", state->" + state + ", errorType->" + errCode); - } - else{ - /**Other failures which can't be recover by republishing, make changes to the UI.*/ + } else { + /*Other failures which can't be recovered by republishing, make changes to the UI.*/ retryTask.cancel(true); unpublishing = true; } - } else if (state == Constants.RTMP_STREAM_PUBLISH_STATE_IDLE) { - if(unpublishing){ + } else if (state == Constants.RTMP_STREAM_PUBLISH_STATE_IDLE) { + if (unpublishing) { unpublishing = false; - /**Push stream not started or ended, make changes to the UI.*/ + publishing = false; + /*Push stream not started or ended, make changes to the UI.*/ handler.post(() -> { publish.setEnabled(true); publish.setText(getString(R.string.publish)); transCodeSwitch.setEnabled(true); }); - } - else if( retried >= MAX_RETRY_TIMES){ + } else if (retried >= MAX_RETRY_TIMES) { retryTask.cancel(true); retried = 0; - /**Push stream not started or ended, make changes to the UI.*/ + /*Push stream not started or ended, make changes to the UI.*/ handler.post(() -> { publish.setEnabled(true); publish.setText(getString(R.string.publish)); transCodeSwitch.setEnabled(true); }); - } - else{ + } else { retried++; startRtmpStreaming(); } @@ -617,23 +591,20 @@ else if( retried >= MAX_RETRY_TIMES){ * @param elapsed Time delay (ms) from the local user calling joinChannel/setClientRole * until this callback is triggered.*/ @Override - public void onUserJoined(int uid, int elapsed) - { + public void
onUserJoined(int uid, int elapsed) {
         super.onUserJoined(uid, elapsed);
         Log.i(TAG, "onUserJoined->" + uid);
         showLongToast(String.format("user %d joined!", uid));
-        /**Check if the context is correct*/
+        /*Check if the context is correct*/
         Context context = getContext();
         if (context == null) {
             return;
         }
-        handler.post(() ->
-        {
-            /**Display remote video stream*/
+        handler.post(() -> {
+            /*Display remote video stream*/
             SurfaceView surfaceView = new SurfaceView(context);
             surfaceView.setZOrderMediaOverlay(true);
-            if (fl_remote.getChildCount() > 0)
-            {
+            if (fl_remote.getChildCount() > 0) {
                 fl_remote.removeAllViews();
             }
             // Add to the remote container
@@ -641,10 +612,10 @@ public void onUserJoined(int uid, int elapsed)
             // Setup remote video to render
             engine.setupRemoteVideo(new VideoCanvas(surfaceView, RENDER_MODE_HIDDEN, uid));
         });
-        /**Determine whether to open transcoding service and whether the current number of
+        /*Determine whether to open transcoding service and whether the current number of
         * transcoding users exceeds the maximum number of users*/
         if (transCodeSwitch.isChecked() && transcoding.getUserCount() < MAXUserCount) {
-            /**The transcoding images are arranged vertically according to the adding order*/
+            /*The transcoding images are arranged vertically according to the adding order*/
             LiveTranscoding.TranscodingUser transcodingUser = new LiveTranscoding.TranscodingUser();
             transcodingUser.x = 0;
             transcodingUser.y = transcoding.height / MAXUserCount;
@@ -652,15 +623,15 @@ public void onUserJoined(int uid, int elapsed)
             transcodingUser.height = transcoding.height / MAXUserCount;
             transcodingUser.uid = uid;
             transcoding.addUser(transcodingUser);
-            /**refresh transCoding configuration*/
-            int ret = engine.updateRtmpTranscoding(transcoding);
+            /*refresh transCoding configuration*/
+            engine.updateRtmpTranscoding(transcoding);
         }
     }

     @Override
     public void onRtmpStreamingEvent(String url, int event) {
         super.onRtmpStreamingEvent(url, event);
-        if(event == Constants.RTMP_STREAMING_EVENT_URL_ALREADY_IN_USE){
+        if (event == Constants.RTMP_STREAMING_EVENT_URL_ALREADY_IN_USE) {
             showLongToast(String.format("The URL %s is already in use.", url));
         }
     }
@@ -676,25 +647,24 @@ public void onRtmpStreamingEvent(String url, int event) {
      * USER_OFFLINE_BECOME_AUDIENCE(2): (Live broadcast only.) The client role switched from
      * the host to the audience.*/
     @Override
-    public void onUserOffline(int uid, int reason)
-    {
+    public void onUserOffline(int uid, int reason) {
         Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason));
         showLongToast(String.format("user %d offline! reason:%d", uid, reason));
         handler.post(new Runnable() {
             @Override
             public void run() {
-                /**Clear render view
+                /*Clear render view
                 Note: The video will stay at its last frame, to completely remove it you will need to
                 remove the SurfaceView from its parent*/
                 engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid));
-                if(transcoding != null) {
-                    /**Removes a user from CDN live.
+                if (transcoding != null) {
+                    /*Removes a user from CDN live.
                     * @return
                     * 0: Success.
                     * < 0: Failure.*/
                     int code = transcoding.removeUser(uid);
                     if (code == Constants.ERR_OK) {
-                        /**refresh transCoding configuration*/
+                        /*refresh transCoding configuration*/
                         engine.updateRtmpTranscoding(transcoding);
                     }
                 }
diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/RhythmPlayer.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/RhythmPlayer.java
index da698a23f..a2bc2a308 100644
--- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/RhythmPlayer.java
+++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/RhythmPlayer.java
@@ -59,23 +59,20 @@ public class RhythmPlayer extends BaseFragment implements View.OnClickListener,
     private ChannelMediaOptions mChannelMediaOptions;

     @Override
-    public void onCreate(@Nullable Bundle savedInstanceState)
-    {
+    public void onCreate(@Nullable Bundle savedInstanceState) {
         super.onCreate(savedInstanceState);
         handler = new Handler();
     }

     @Nullable
     @Override
-    public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState)
-    {
+    public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) {
         View view = inflater.inflate(R.layout.fragment_rhythm_player, container, false);
         return view;
     }

     @Override
-    public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState)
-    {
+    public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) {
         super.onViewCreated(view, savedInstanceState);
         join = view.findViewById(R.id.btn_join);
         play = view.findViewById(R.id.play);
@@ -91,42 +88,39 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat
     }

     @Override
-    public void onActivityCreated(@Nullable Bundle savedInstanceState)
-    {
+    public void onActivityCreated(@Nullable Bundle savedInstanceState) {
         super.onActivityCreated(savedInstanceState);
         // Check if the context is valid
         Context context = getContext();
-        if (context == null)
-        {
+        if (context == null) {
             return;
         }
-        try
-        {
+        try {
             RtcEngineConfig config = new RtcEngineConfig();
-            /**
+            /*
             * The context of Android Activity
             */
             config.mContext = context.getApplicationContext();
-            /**
+            /*
             * The App ID issued to you by Agora. See How to get the App ID
             */
             config.mAppId = getString(R.string.agora_app_id);
-            /** Sets the channel profile of the Agora RtcEngine.
+            /* Sets the channel profile of the Agora RtcEngine.
             CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile.
             Use this profile in one-on-one calls or group calls, where all users can talk freely.
             CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast
             channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams;
             an audience can only receive streams.*/
             config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING;
-            /**
+            /*
             * IRtcEngineEventHandler is an abstract class providing default implementation.
             * The SDK uses this class to report to the app on SDK runtime events.
             */
             config.mEventHandler = iRtcEngineEventHandler;
             config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT);
-            config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode();
+            config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode();
             engine = RtcEngine.create(config);
-            /**
+            /*
             * This parameter is for reporting the usages of APIExample to agora background.
             * Generally, it is not necessary for you to set this parameter.
             */
@@ -145,20 +139,17 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState)
                 engine.setLocalAccessPoint(localAccessPointConfiguration);
             }
         }
-        catch (Exception e)
-        {
+        catch (Exception e) {
             e.printStackTrace();
             getActivity().onBackPressed();
         }
     }

     @Override
-    public void onDestroy()
-    {
+    public void onDestroy() {
         super.onDestroy();
-        /**leaveChannel and Destroy the RtcEngine instance*/
-        if(engine != null)
-        {
+        /*leaveChannel and Destroy the RtcEngine instance*/
+        if (engine != null) {
             engine.stopRhythmPlayer();
             engine.leaveChannel();
         }
@@ -168,18 +159,14 @@ public void onDestroy()

     @Override
-    public void onClick(View v)
-    {
-        if (v.getId() == R.id.btn_join)
-        {
-            if (!joined)
-            {
+    public void onClick(View v) {
+        if (v.getId() == R.id.btn_join) {
+            if (!joined) {
                 CommonUtil.hideInputBoard(getActivity(), et_channel);
                 // call when join button hit
                 String channelId = et_channel.getText().toString();
                 // Check permission
-                if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA))
-                {
+                if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) {
                     joinChannel(channelId);
                     return;
                 }
@@ -187,16 +174,13 @@ public void onClick(View v)
                 AndPermission.with(this).runtime().permission(
                         Permission.Group.STORAGE,
                         Permission.Group.MICROPHONE
-                ).onGranted(permissions ->
-                {
+                ).onGranted(permissions -> {
                     // Permissions Granted
                     joinChannel(channelId);
                 }).start();
-            }
-            else
-            {
+            } else {
                 joined = false;
-                /**After joining a channel, the user must call the leaveChannel method to end the
+                /*After joining a channel, the user must call the leaveChannel method to end the
                 * call before joining another channel. This method returns 0 if the user leaves the
                 * channel and releases all resources related to the call. This method call is
                 * asynchronous, and the user has not exited the channel when the method call returns.
@@ -216,9 +200,8 @@ public void onClick(View v)
                 engine.leaveChannel();
                 join.setText(getString(R.string.join));
             }
-        }
-        else if(v.getId() == R.id.play){
-            if(!isPlaying){
+        } else if (v.getId() == R.id.play) {
+            if (!isPlaying) {
                 int ret = engine.startRhythmPlayer(URL_DOWNBEAT, URL_UPBEAT, agoraRhythmPlayerConfig);
                 if (joined) {
                     mChannelMediaOptions.publishRhythmPlayerTrack = true;
@@ -230,8 +213,7 @@ else if(v.getId() == R.id.play){
                 beatPerMeasure.setEnabled(false);
                 beatPerMinute.setEnabled(false);
             }
-        }
-        else if(v.getId() == R.id.stop){
+        } else if (v.getId() == R.id.stop) {
             engine.stopRhythmPlayer();
             if (joined) {
                 mChannelMediaOptions.publishRhythmPlayerTrack = false;
@@ -245,27 +227,27 @@ else if(v.getId() == R.id.stop){

     /**
      * @param channelId Specify the channel name that you want to join.
-     *                  Users that input the same channel name join the same channel.*/
-    private void joinChannel(String channelId)
-    {
-        /**In the demo, the default is to enter as the anchor.*/
+     *                  Users that input the same channel name join the same channel.
+     */
+    private void joinChannel(String channelId) {
+        /*In the demo, the default is to enter as the anchor.*/
         engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER);
         engine.enableAudioVolumeIndication(1000, 3, true);
-        /**Please configure accessToken in the string_config file.
+        /*Please configure accessToken in the string_config file.
         * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see
         *      https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token
         * A token generated at the server. This applies to scenarios with high-security requirements. For details, see
         *      https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/
         TokenUtils.gen(requireContext(), channelId, 0, accessToken -> {
-            /** Allows a user to join a channel.
+            /* Allows a user to join a channel.
             if you do not specify the uid, we will generate the uid for you*/
             mChannelMediaOptions = new ChannelMediaOptions();
             mChannelMediaOptions.autoSubscribeAudio = true;
             mChannelMediaOptions.autoSubscribeVideo = true;
             mChannelMediaOptions.publishMicrophoneTrack = true;
-            /**
+            /*
             * config this for whether need push rhythem player to remote
             */
             mChannelMediaOptions.publishRhythmPlayerTrack = isPlaying;
@@ -284,18 +266,18 @@ private void joinChannel(String channelId)
         });
     }

-    /**IRtcEngineEventHandler is an abstract class providing default implementation.
-     * The SDK uses this class to report to the app on SDK runtime events.*/
-    private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler()
-    {
+    /**
+     * IRtcEngineEventHandler is an abstract class providing default implementation.
+     * The SDK uses this class to report to the app on SDK runtime events.
+     */
+    private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() {
         /**
          * Error code description can be found at:
          * en: https://api-ref.agora.io/en/video-sdk/android/4.x/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror
          * cn: https://docs.agora.io/cn/video-call-4.x/API%20Reference/java_ng/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror
          */
         @Override
-        public void onError(int err)
-        {
+        public void onError(int err) {
             Log.w(TAG, String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err)));
         }
@@ -303,8 +285,7 @@ public void onError(int err)
          * @param stats With this callback, the application retrieves the channel information,
          *              such as the call duration and statistics.*/
         @Override
-        public void onLeaveChannel(RtcStats stats)
-        {
+        public void onLeaveChannel(RtcStats stats) {
             super.onLeaveChannel(stats);
             Log.i(TAG, String.format("local user %d leaveChannel!", myUid));
             showLongToast(String.format("local user %d leaveChannel!", myUid));
@@ -317,17 +298,14 @@ public void onLeaveChannel(RtcStats stats)
          * @param uid User ID
          * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/
         @Override
-        public void onJoinChannelSuccess(String channel, int uid, int elapsed)
-        {
+        public void onJoinChannelSuccess(String channel, int uid, int elapsed) {
             Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid));
             showLongToast(String.format("onJoinChannelSuccess channel %s uid %d", channel, uid));
             myUid = uid;
             joined = true;
-            handler.post(new Runnable()
-            {
+            handler.post(new Runnable() {
                 @Override
-                public void run()
-                {
+                public void run() {
                     join.setEnabled(true);
                     join.setText(getString(R.string.leave));
                     play.setEnabled(true);
@@ -381,8 +359,7 @@ public void onRemoteAudioStateChanged(int uid, int state, int reason, int elapse
          * @param elapsed Time delay (ms) from the local user calling joinChannel/setClientRole
          *                until this callback is triggered.*/
         @Override
-        public void onUserJoined(int uid, int elapsed)
-        {
+        public void onUserJoined(int uid, int elapsed) {
             super.onUserJoined(uid, elapsed);
             Log.i(TAG, "onUserJoined->" + uid);
             showLongToast(String.format("user %d joined!", uid));
@@ -399,8 +376,7 @@ public void onUserJoined(int uid, int elapsed)
          * USER_OFFLINE_BECOME_AUDIENCE(2): (Live broadcast only.) The client role switched from
          * the host to the audience.*/
         @Override
-        public void onUserOffline(int uid, int reason)
-        {
+        public void onUserOffline(int uid, int reason) {
             Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason));
             showLongToast(String.format("user %d offline! reason:%d", uid, reason));
         }
@@ -414,13 +390,12 @@ public void onActiveSpeaker(int uid) {

     @Override
     public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
-        if(seekBar.getId() == R.id.beatsPerMeasure){
+        if (seekBar.getId() == R.id.beatsPerMeasure) {
             agoraRhythmPlayerConfig.beatsPerMeasure = seekBar.getProgress() < 1 ? 1 : seekBar.getProgress();
-        }
-        else if(seekBar.getId() == R.id.beatsPerMinute){
+        } else if (seekBar.getId() == R.id.beatsPerMinute) {
             agoraRhythmPlayerConfig.beatsPerMinute = seekBar.getProgress() < 60 ? 60 : seekBar.getProgress();
         }
-        Log.i(TAG, "agoraRhythmPlayerConfig beatsPerMeasure:"+ agoraRhythmPlayerConfig.beatsPerMeasure +", beatsPerMinute:" + agoraRhythmPlayerConfig.beatsPerMinute);
+        Log.i(TAG, "agoraRhythmPlayerConfig beatsPerMeasure:" + agoraRhythmPlayerConfig.beatsPerMeasure + ", beatsPerMinute:" + agoraRhythmPlayerConfig.beatsPerMinute);
     }

     @Override
diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ScreenSharing.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ScreenSharing.java
index 2189dda02..5bb3b867d 100644
--- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ScreenSharing.java
+++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ScreenSharing.java
@@ -112,30 +112,30 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) {
         }
         try {
             RtcEngineConfig config = new RtcEngineConfig();
-            /**
+            /*
             * The context of Android Activity
             */
             config.mContext = context.getApplicationContext();
-            /**
+            /*
             * The App ID issued to you by Agora. See How to get the App ID
             */
             config.mAppId = getString(R.string.agora_app_id);
-            /** Sets the channel profile of the Agora RtcEngine.
+            /* Sets the channel profile of the Agora RtcEngine.
             CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile.
             Use this profile in one-on-one calls or group calls, where all users can talk freely.
             CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast
             channel have a role as either broadcaster or audience.
             A broadcaster can both send and receive streams; an audience can only receive streams.*/
             config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING;
-            /**
+            /*
             * IRtcEngineEventHandler is an abstract class providing default implementation.
             * The SDK uses this class to report to the app on SDK runtime events.
             */
             config.mEventHandler = iRtcEngineEventHandler;
             config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT);
-            config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode();
+            config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode();
             engine = (RtcEngineEx) RtcEngine.create(config);
-            /**
+            /*
             * This parameter is for reporting the usages of APIExample to agora background.
             * Generally, it is not necessary for you to set this parameter.
             */
@@ -161,7 +161,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) {

     @Override
     public void onDestroy() {
-        /**leaveChannel and Destroy the RtcEngine instance*/
+        /*leaveChannel and Destroy the RtcEngine instance*/
         if (engine != null) {
             engine.leaveChannel();
             engine.stopScreenCapture();
@@ -210,8 +210,7 @@ public void onClick(View v) {
                         Permission.Group.STORAGE,
                         Permission.Group.MICROPHONE,
                         Permission.Group.CAMERA
-                ).onGranted(permissions ->
-                {
+                ).onGranted(permissions -> {
                     // Permissions Granted
                     joinChannel(channelId);
                 }).start();
@@ -255,7 +254,7 @@ private void joinChannel(String channelId) {
         engine.setParameters("{\"che.video.mobile_1080p\":true}");
         engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER);
-        /**Enable video module*/
+        /*Enable video module*/
         engine.enableVideo();
         // Setup video encoding configs
         engine.setVideoEncoderConfiguration(new VideoEncoderConfiguration(
@@ -264,7 +263,7 @@ private void joinChannel(String channelId) {
                 STANDARD_BITRATE,
                 ORIENTATION_MODE_ADAPTIVE
         ));
-        /**Set up to play remote sound with receiver*/
+        /*Set up to play remote sound with receiver*/
         engine.setDefaultAudioRoutetoSpeakerphone(true);

         DisplayMetrics metrics = new DisplayMetrics();
@@ -281,13 +280,13 @@ private void joinChannel(String channelId) {
             startScreenSharePreview();
         }
-        /**Please configure accessToken in the string_config file.
+        /*Please configure accessToken in the string_config file.
         * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see
         *      https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token
         * A token generated at the server. This applies to scenarios with high-security requirements. For details, see
         *      https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/
         TokenUtils.gen(requireContext(), channelId, 0, accessToken -> {
-            /** Allows a user to join a channel.
+            /* Allows a user to join a channel.
             if you do not specify the uid, we will generate the uid for you*/
             // set options
             ChannelMediaOptions options = new ChannelMediaOptions();
@@ -350,7 +349,7 @@ public void onJoinChannelSuccess(String channel, int uid, int elapsed) {
         public void onLocalVideoStateChanged(Constants.VideoSourceType source, int state, int error) {
             super.onLocalVideoStateChanged(source, state, error);
             Log.i(TAG, "onLocalVideoStateChanged source=" + source + ", state=" + state + ", error=" + error);
-            if(source == Constants.VideoSourceType.VIDEO_SOURCE_SCREEN_PRIMARY){
+            if (source == Constants.VideoSourceType.VIDEO_SOURCE_SCREEN_PRIMARY) {
                 if (state == Constants.LOCAL_VIDEO_STREAM_STATE_ENCODING) {
                     if (error == Constants.ERR_OK) {
                         showLongToast("Screen sharing start successfully.");
@@ -466,7 +465,8 @@ private void leaveChannel() {
         join.setText(getString(R.string.join));
         fl_local.removeAllViews();
         fl_remote.removeAllViews();
-        remoteUid = myUid = -1;
+        remoteUid = -1;
+        myUid = -1;
         engine.leaveChannel();
         engine.stopScreenCapture();
diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SendDataStream.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SendDataStream.java
index babee1404..c5db3c444 100644
--- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SendDataStream.java
+++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SendDataStream.java
@@ -40,6 +40,9 @@
 import io.agora.rtc2.video.VideoCanvas;
 import io.agora.rtc2.video.VideoEncoderConfiguration;

+/**
+ * The type Send data stream.
+ */
 @Example(
         index = 20,
         group = ADVANCED,
@@ -47,8 +50,10 @@
         actionId = R.id.action_mainFragment_senddatastream,
         tipsId = R.string.senddatastream
 )
-public class SendDataStream extends BaseFragment implements View.OnClickListener
-{
+public class SendDataStream extends BaseFragment implements View.OnClickListener {
+    /**
+     * The constant TAG.
+     */
     public static final String TAG = SendDataStream.class.getSimpleName();
     private FrameLayout fl_local, fl_remote;
     private Button send, join;
@@ -63,15 +68,13 @@ public class SendDataStream extends BaseFragment implements View.OnClickListener

     @Nullable
     @Override
-    public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState)
-    {
+    public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) {
         View view = inflater.inflate(R.layout.fragment_send_datastream, container, false);
         return view;
     }

     @Override
-    public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState)
-    {
+    public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) {
         super.onViewCreated(view, savedInstanceState);
         send = view.findViewById(R.id.btn_send);
         send.setOnClickListener(this);
@@ -84,42 +87,39 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat
     }

     @Override
-    public void onActivityCreated(@Nullable Bundle savedInstanceState)
-    {
+    public void onActivityCreated(@Nullable Bundle savedInstanceState) {
         super.onActivityCreated(savedInstanceState);
         // Check if the context is valid
         Context context = getContext();
-        if (context == null)
-        {
+        if (context == null) {
             return;
         }
-        try
-        {
+        try {
             RtcEngineConfig config = new RtcEngineConfig();
-            /**
+            /*
             * The context of Android Activity
             */
             config.mContext = context.getApplicationContext();
-            /**
+            /*
             * The App ID issued to you by Agora. See How to get the App ID
             */
             config.mAppId = getString(R.string.agora_app_id);
-            /** Sets the channel profile of the Agora RtcEngine.
+            /* Sets the channel profile of the Agora RtcEngine.
             CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile.
             Use this profile in one-on-one calls or group calls, where all users can talk freely.
             CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast
             channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams;
             an audience can only receive streams.*/
             config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING;
-            /**
+            /*
             * IRtcEngineEventHandler is an abstract class providing default implementation.
             * The SDK uses this class to report to the app on SDK runtime events.
             */
             config.mEventHandler = iRtcEngineEventHandler;
             config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT);
-            config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode();
+            config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode();
             engine = RtcEngine.create(config);
-            /**
+            /*
             * This parameter is for reporting the usages of APIExample to agora background.
             * Generally, it is not necessary for you to set this parameter.
             */
@@ -138,20 +138,17 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState)
                 engine.setLocalAccessPoint(localAccessPointConfiguration);
             }
         }
-        catch (Exception e)
-        {
+        catch (Exception e) {
             e.printStackTrace();
             getActivity().onBackPressed();
         }
     }

     @Override
-    public void onDestroy()
-    {
+    public void onDestroy() {
         super.onDestroy();
-        /**leaveChannel and Destroy the RtcEngine instance*/
-        if (engine != null)
-        {
+        /*leaveChannel and Destroy the RtcEngine instance*/
+        if (engine != null) {
             engine.leaveChannel();
         }
         handler.post(RtcEngine::destroy);
@@ -159,18 +156,14 @@ public void onDestroy()
     }

     @Override
-    public void onClick(View v)
-    {
-        if (v.getId() == R.id.btn_join)
-        {
-            if (!joined)
-            {
+    public void onClick(View v) {
+        if (v.getId() == R.id.btn_join) {
+            if (!joined) {
                 CommonUtil.hideInputBoard(getActivity(), et_channel);
                 // call when join button hit
                 String channelId = et_channel.getText().toString();
                 // Check permission
-                if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA))
-                {
+                if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) {
                     joinChannel(channelId);
                     return;
                 }
@@ -179,16 +172,13 @@ public void onClick(View v)
                         Permission.Group.STORAGE,
                         Permission.Group.MICROPHONE,
                         Permission.Group.CAMERA
-                ).onGranted(permissions ->
-                {
+                ).onGranted(permissions -> {
                     // Permissions Granted
                     joinChannel(channelId);
                 }).start();
-            }
-            else
-            {
+            } else {
                 joined = false;
-                /**After joining a channel, the user must call the leaveChannel method to end the
+                /*After joining a channel, the user must call the leaveChannel method to end the
                 * call before joining another channel. This method returns 0 if the user leaves the
                 * channel and releases all resources related to the call. This method call is
                 * asynchronous, and the user has not exited the channel when the method call returns.
@@ -209,10 +199,8 @@ public void onClick(View v)
                 send.setEnabled(false);
                 join.setText(getString(R.string.join));
             }
-        }
-        else if (v.getId() == R.id.btn_send)
-        {
-            /**Click once, the metadata is sent once.
+        } else if (v.getId() == R.id.btn_send) {
+            /*Click once, the metadata is sent once.
             * {@link SendDataStream#iMetadataObserver}.
             * The metadata here can be flexibly replaced according to your own business.*/
             data = String.valueOf(new Date().toString()).getBytes(Charset.forName("UTF-8"));
@@ -221,12 +209,10 @@ else if (v.getId() == R.id.btn_send)
         }
     }

-    private void joinChannel(String channelId)
-    {
+    private void joinChannel(String channelId) {
         // Check if the context is valid
         Context context = getContext();
-        if (context == null)
-        {
+        if (context == null) {
             return;
         }
@@ -237,35 +223,34 @@ private void joinChannel(String channelId)
         // Setup local video to render your local camera preview
         engine.setupLocalVideo(new VideoCanvas(surfaceView, RENDER_MODE_HIDDEN, 0));

-        /**In the demo, the default is to enter as the anchor.*/
+        /*In the demo, the default is to enter as the anchor.*/
         engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER);
         // Enable video module
         engine.enableVideo();
         // Setup video encoding configs
         engine.setVideoEncoderConfiguration(new VideoEncoderConfiguration(
-                ((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(),
-                VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()),
+                ((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(),
+                VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()),
                 STANDARD_BITRATE,
-                VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation())
+                VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation())
         ));
-        /**Set up to play remote sound with receiver*/
+        /*Set up to play remote sound with receiver*/
         engine.setDefaultAudioRoutetoSpeakerphone(true);

-        /**Please configure accessToken in the string_config file.
+        /*Please configure accessToken in the string_config file.
         * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see
         *      https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token
         * A token generated at the server. This applies to scenarios with high-security requirements. For details, see
         *      https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/
         TokenUtils.gen(requireContext(), channelId, 0, accessToken -> {
-            /** Allows a user to join a channel.
+            /* Allows a user to join a channel.
             if you do not specify the uid, we will generate the uid for you*/
             ChannelMediaOptions option = new ChannelMediaOptions();
             option.autoSubscribeAudio = true;
             option.autoSubscribeVideo = true;
             int res = engine.joinChannel(accessToken, channelId, 0, option);
-            if (res != 0)
-            {
+            if (res != 0) {
                 // Usually happens with invalid parameters
                 // Error code description can be found at:
                 // en: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html
@@ -282,16 +267,14 @@ private void joinChannel(String channelId)
      * IRtcEngineEventHandler is an abstract class providing default implementation.
      * The SDK uses this class to report to the app on SDK runtime events.
      */
-    private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler()
-    {
+    private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() {
         /**
          * Error code description can be found at:
          * en: https://api-ref.agora.io/en/video-sdk/android/4.x/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror
          * cn: https://docs.agora.io/cn/video-call-4.x/API%20Reference/java_ng/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror
         */
         @Override
-        public void onError(int err)
-        {
+        public void onError(int err) {
             Log.w(TAG, String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err)));
         }
@@ -300,8 +283,7 @@ public void onError(int err)
          * @param stats With this callback, the application retrieves the channel information,
          *              such as the call duration and statistics.*/
         @Override
-        public void onLeaveChannel(RtcStats stats)
-        {
+        public void onLeaveChannel(RtcStats stats) {
             super.onLeaveChannel(stats);
             Log.i(TAG, String.format("local user %d leaveChannel!", myUid));
             showLongToast(String.format("local user %d leaveChannel!", myUid));
@@ -314,17 +296,14 @@ public void onLeaveChannel(RtcStats stats)
          * @param uid User ID
          * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/
         @Override
-        public void onJoinChannelSuccess(String channel, int uid, int elapsed)
-        {
+        public void onJoinChannelSuccess(String channel, int uid, int elapsed) {
             Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid));
             showLongToast(String.format("onJoinChannelSuccess channel %s uid %d", channel, uid));
             myUid = uid;
             joined = true;
-            handler.post(new Runnable()
-            {
+            handler.post(new Runnable() {
                 @Override
-                public void run()
-                {
+                public void run() {
                     send.setEnabled(true);
                     join.setEnabled(true);
                     join.setText(getString(R.string.leave));
@@ -408,8 +387,7 @@ public void onRemoteAudioStateChanged(int uid, int state, int reason, int elapse
          * @param elapsed Time elapsed (ms) from the local user calling the joinChannel method until
          *                the SDK triggers this callback.*/
         @Override
-        public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed)
-        {
+        public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) {
             super.onRemoteVideoStateChanged(uid, state, reason, elapsed);
             Log.i(TAG, "onRemoteVideoStateChanged->" + uid + ", state->" + state + ", reason->" + reason);
         }
@@ -419,23 +397,20 @@ public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapse
          * @param elapsed Time delay (ms) from the local user calling joinChannel/setClientRole
          *                until this callback is triggered.*/
         @Override
-        public void onUserJoined(int uid, int elapsed)
-        {
+        public void onUserJoined(int uid, int elapsed) {
             super.onUserJoined(uid, elapsed);
             Log.i(TAG, "onUserJoined->" + uid);
             showLongToast(String.format("user %d joined!", uid));
-            /**Check if the context is correct*/
+            /*Check if the context is correct*/
             Context context = getContext();
             if (context == null) {
                 return;
             }
-            handler.post(() ->
-            {
-                /**Display remote video stream*/
+            handler.post(() -> {
+                /*Display remote video stream*/
                 SurfaceView surfaceView = new SurfaceView(context);
                 surfaceView.setZOrderMediaOverlay(true);
-                if (fl_remote.getChildCount() > 0)
-                {
+                if (fl_remote.getChildCount() > 0) {
                     fl_remote.removeAllViews();
                 }
                 // Add to the remote container
@@ -457,16 +432,13 @@ public void onUserJoined(int uid, int elapsed)
          * USER_OFFLINE_BECOME_AUDIENCE(2): (Live broadcast only.) The client role switched from
          * the host to the audience.*/
         @Override
-        public void onUserOffline(int uid, int reason)
-        {
+        public void onUserOffline(int uid, int reason) {
             Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason));
             showLongToast(String.format("user %d offline! reason:%d", uid, reason));
-            handler.post(new Runnable()
-            {
+            handler.post(new Runnable() {
                 @Override
-                public void run()
-                {
-                    /**Clear render view
+                public void run() {
+                    /*Clear render view
                     Note: The video will stay at its last frame, to completely remove it you will need to
                     remove the SurfaceView from its parent*/
                     engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid));
diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SimpleExtension.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SimpleExtension.java
index bb7d4dd99..3897cae9d 100644
--- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SimpleExtension.java
+++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SimpleExtension.java
@@ -55,13 +55,37 @@
 )
 public class SimpleExtension extends BaseFragment implements View.OnClickListener, io.agora.rtc2.IMediaExtensionObserver {
     private static final String TAG = SimpleExtension.class.getSimpleName();
+    /**
+     * The constant EXTENSION_NAME.
+     */
     public static final String EXTENSION_NAME = "agora-simple-filter"; // Name of target link library used in CMakeLists.txt
+    /**
+     * The constant EXTENSION_VENDOR_NAME.
+     */
     public static final String EXTENSION_VENDOR_NAME = "Agora"; // Provider name used for registering in agora-bytedance.cpp
+    /**
+     * The constant EXTENSION_VIDEO_FILTER_WATERMARK.
+     */
     public static final String EXTENSION_VIDEO_FILTER_WATERMARK = "Watermark"; // Video filter name defined in ExtensionProvider.h
+    /**
+     * The constant EXTENSION_AUDIO_FILTER_VOLUME.
+     */
     public static final String EXTENSION_AUDIO_FILTER_VOLUME = "VolumeChange"; // Audio filter name defined in ExtensionProvider.h
+    /**
+     * The constant KEY_ENABLE_WATER_MARK.
+     */
     public static final String KEY_ENABLE_WATER_MARK = "key";
+    /**
+     * The constant ENABLE_WATER_MARK_FLAG.
+     */
     public static final String ENABLE_WATER_MARK_FLAG = "plugin.watermark.wmEffectEnabled";
+    /**
+     * The constant ENABLE_WATER_MARK_STRING.
+     */
     public static final String ENABLE_WATER_MARK_STRING = "plugin.watermark.wmStr";
+    /**
+     * The constant KEY_ADJUST_VOLUME_CHANGE.
+     */
     public static final String KEY_ADJUST_VOLUME_CHANGE = "volume";
     private FrameLayout local_view, remote_view;
     private EditText et_channel;
@@ -72,11 +96,14 @@ public class SimpleExtension extends BaseFragment implements View.OnClickListene
     private SeekBar record;

+    /**
+     * The Seek bar change listener.
+     */
     SeekBar.OnSeekBarChangeListener seekBarChangeListener = new SeekBar.OnSeekBarChangeListener() {
         @Override
         public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
-            if(joined && seekBar.getId() == record.getId()){
-                engine.setExtensionProperty(EXTENSION_VENDOR_NAME, EXTENSION_AUDIO_FILTER_VOLUME, KEY_ADJUST_VOLUME_CHANGE, ""+progress);
+            if (joined && seekBar.getId() == record.getId()) {
+                engine.setExtensionProperty(EXTENSION_VENDOR_NAME, EXTENSION_AUDIO_FILTER_VOLUME, KEY_ADJUST_VOLUME_CHANGE, "" + progress);
             }
         }
@@ -125,7 +152,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) {
         if (context == null) {
             return;
         }
-        if(!hasAgoraSimpleFilterLib()){
+        if (!hasAgoraSimpleFilterLib()) {
             new AlertDialog.Builder(context)
                     .setMessage(R.string.simple_extension_tip)
                     .setCancelable(false)
@@ -138,22 +165,22 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) {
         }
         try {
             RtcEngineConfig config = new RtcEngineConfig();
-            /**
+            /*
             * The context of Android Activity
             */
             config.mContext = context.getApplicationContext();
-            /**
+            /*
             * The App ID issued to you by Agora. See How to get the App ID
             */
             config.mAppId = getString(R.string.agora_app_id);
-            /** Sets the channel profile of the Agora RtcEngine.
+            /* Sets the channel profile of the Agora RtcEngine.
             CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile.
             Use this profile in one-on-one calls or group calls, where all users can talk freely.
             CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast
             channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams;
             an audience can only receive streams.*/
             config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING;
-            /**
+            /*
             * IRtcEngineEventHandler is an abstract class providing default implementation.
             * The SDK uses this class to report to the app on SDK runtime events.
             */
@@ -163,9 +190,9 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) {
             config.addExtension(EXTENSION_NAME);
             config.mExtensionObserver = this;
             config.mEventHandler = iRtcEngineEventHandler;
-            config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode();
+            config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode();
             engine = RtcEngine.create(config);
-            /**
+            /*
             * This parameter is for reporting the usages of APIExample to agora background.
             * Generally, it is not necessary for you to set this parameter.
             */
@@ -183,7 +210,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) {
             // This api can only be used in the private media server scenario, otherwise some problems may occur.
             engine.setLocalAccessPoint(localAccessPointConfiguration);
         }
-        /**
+        /*
         * Enable/Disable extension.
        *
        * @param id id for extension, e.g. agora.beauty.
@@ -202,8 +229,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { engine.enableVideo(); // Create render view by RtcEngine TextureView textureView = new TextureView(context); - if(local_view.getChildCount() > 0) - { + if (local_view.getChildCount() > 0) { local_view.removeAllViews(); } // Add to the local container @@ -213,8 +239,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { engine.startPreview(); - } - catch (Exception e) { + } catch (Exception e) { e.printStackTrace(); getActivity().onBackPressed(); } @@ -223,7 +248,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { @Override public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ + /*leaveChannel and Destroy the RtcEngine instance*/ if (engine != null) { engine.leaveChannel(); } @@ -231,7 +256,7 @@ public void onDestroy() { engine = null; } - private void setWaterMarkProperty(){ + private void setWaterMarkProperty() { String jsonValue = null; JSONObject o = new JSONObject(); @@ -263,14 +288,13 @@ public void onClick(View v) { AndPermission.with(this).runtime().permission( Permission.Group.STORAGE, Permission.Group.MICROPHONE - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. 
@@ -295,7 +319,7 @@ public void onClick(View v) { } } - private boolean hasAgoraSimpleFilterLib(){ + private boolean hasAgoraSimpleFilterLib() { try { Class aClass = Class.forName("io.agora.extension.ExtensionManager"); return aClass != null; @@ -309,11 +333,11 @@ private boolean hasAgoraSimpleFilterLib(){ * Users that input the same channel name join the same channel. */ private void joinChannel(String channelId) { - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(CLIENT_ROLE_BROADCASTER); engine.enableAudioVolumeIndication(1000, 3, false); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. 
For details, see @@ -397,18 +421,16 @@ public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; - } - else{ - handler.post(() -> - { - if(remote_view.getChildCount() > 0){ + } else { + handler.post(() -> { + if (remote_view.getChildCount() > 0) { remote_view.removeAllViews(); } - /**Display remote video stream*/ + /*Display remote video stream*/ TextureView textureView = null; // Create render view by RtcEngine textureView = new TextureView(context); @@ -437,7 +459,7 @@ public void onUserOffline(int uid, int reason) { handler.post(new Runnable() { @Override public void run() { - /**Clear render view + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SpatialSound.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SpatialSound.java index c6a1e34ee..36e5fd7c3 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SpatialSound.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SpatialSound.java @@ -23,6 +23,7 @@ import com.google.android.material.bottomsheet.BottomSheetDialog; +import java.util.Arrays; import java.util.HashMap; import java.util.Map; @@ -36,6 +37,8 @@ import io.agora.mediaplayer.Constants; import io.agora.mediaplayer.IMediaPlayer; import io.agora.mediaplayer.IMediaPlayerObserver; +import io.agora.mediaplayer.data.CacheStatistics; +import io.agora.mediaplayer.data.PlayerPlaybackStats; import io.agora.mediaplayer.data.PlayerUpdatedInfo; import 
io.agora.mediaplayer.data.SrcInfo; import io.agora.rtc2.ChannelMediaOptions; @@ -50,6 +53,9 @@ import io.agora.spatialaudio.RemoteVoicePositionInfo; import io.agora.spatialaudio.SpatialAudioZone; +/** + * The type Spatial sound. + */ @Example( index = 22, group = ADVANCED, @@ -92,7 +98,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { return; } try { - /**Creates an RtcEngine instance. + /*Creates an RtcEngine instance. * @param context The context of Android Activity * @param appId The App ID issued to you by Agora. See * How to get the App ID @@ -105,7 +111,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { config.mEventHandler = iRtcEngineEventHandler; config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = (RtcEngineEx) RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. 
*/ @@ -130,7 +136,6 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { localSpatialAudioConfig.mRtcEngine = engine; localSpatial.initialize(localSpatialAudioConfig); - //localSpatial.muteAllRemoteAudioStreams(true); localSpatial.setMaxAudioRecvCount(2); localSpatial.setAudioRecvRange(AXIS_MAX_DISTANCE); localSpatial.setDistanceUnit(1); @@ -176,6 +181,7 @@ protected void onPositionChanged() { float[] forward = new float[]{1.0F, 0.0F, 0.0F}; float[] right = new float[]{0.0F, 1.0F, 0.0F}; float[] up = new float[]{0.0F, 0.0F, 1.0F}; + Log.d(TAG, "updateSelfPosition >> pos=" + Arrays.toString(pos)); localSpatial.updateSelfPosition(pos, forward, right, up); } }); @@ -199,15 +205,9 @@ protected void onPositionChanged() { mediaPlayerLeftZone.rightLength = viewRelativeSizeInAxis[0]; mediaPlayerLeftZone.upLength = AXIS_MAX_DISTANCE; localSpatial.setZones(new SpatialAudioZone[]{mediaPlayerLeftZone}); - localSpatial.updatePlayerPositionInfo(mediaPlayerLeft.getMediaPlayerId(), getVoicePositionInfo(mediaPlayerLeftIv)); } else { zoneTv.setVisibility(View.INVISIBLE); - SpatialAudioZone worldZone = new SpatialAudioZone(); - worldZone.upLength = AXIS_MAX_DISTANCE * 2; - worldZone.forwardLength = AXIS_MAX_DISTANCE * 2; - worldZone.rightLength = AXIS_MAX_DISTANCE * 2; - localSpatial.setZones(new SpatialAudioZone[]{worldZone}); - localSpatial.updatePlayerPositionInfo(mediaPlayerLeft.getMediaPlayerId(), getVoicePositionInfo(mediaPlayerLeftIv)); + localSpatial.setZones(null); } }); } @@ -219,7 +219,7 @@ private void joinChannel() { engine.setClientRole(io.agora.rtc2.Constants.CLIENT_ROLE_BROADCASTER); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. 
This applies to scenarios with high-security requirements. For details, see @@ -229,7 +229,7 @@ private void joinChannel() { ChannelMediaOptions option = new ChannelMediaOptions(); option.autoSubscribeAudio = true; - /** Allows a user to join a channel. + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ int res = engine.joinChannel(ret, channelId, 0, option); if (res != 0) { @@ -299,7 +299,7 @@ private void initMediaPlayers() { private void showMediaPlayerSettingDialog(IMediaPlayer mediaPlayer) { String key = "MediaPlayer_" + mediaPlayer.getMediaPlayerId(); BottomSheetDialog dialog = cacheDialogs.get(key); - if(dialog != null){ + if (dialog != null) { dialog.show(); return; } @@ -346,7 +346,7 @@ public void onStopTrackingTouch(SeekBar seekBar) { private void showRemoteUserSettingDialog(int uid) { String key = "RemoteUser_" + uid; BottomSheetDialog dialog = cacheDialogs.get(key); - if(dialog != null){ + if (dialog != null) { dialog.show(); return; } @@ -398,7 +398,7 @@ private IMediaPlayer createLoopMediaPlayer() { IMediaPlayer mediaPlayer = engine.createMediaPlayer(); mediaPlayer.registerPlayerObserver(new IMediaPlayerObserver() { @Override - public void onPlayerStateChanged(Constants.MediaPlayerState state, Constants.MediaPlayerError error) { + public void onPlayerStateChanged(Constants.MediaPlayerState state, Constants.MediaPlayerReason reason) { if (state.equals(PLAYER_STATE_OPEN_COMPLETED)) { mediaPlayer.setLoopCount(-1); mediaPlayer.play(); @@ -406,7 +406,7 @@ public void onPlayerStateChanged(Constants.MediaPlayerState state, Constants.Med } @Override - public void onPositionChanged(long position_ms) { + public void onPositionChanged(long positionMs, long timestampMs) { } @@ -445,6 +445,16 @@ public void onPlayerInfoUpdated(PlayerUpdatedInfo info) { } + @Override + public void onPlayerCacheStats(CacheStatistics stats) { + + } + + @Override + public void onPlayerPlaybackStats(PlayerPlaybackStats stats) { + + 
} + @Override public void onAudioVolumeIndication(int volume) { @@ -465,22 +475,22 @@ private float[] getVoicePosition(View view) { float transY = view.getTranslationY(); double posForward = -1 * AXIS_MAX_DISTANCE * transY / ((rootView.getHeight()) / 2.0f); double posRight = AXIS_MAX_DISTANCE * transX / ((rootView.getWidth()) / 2.0f); - Log.d(TAG, "VoicePosition posForward=" + posForward + ", posRight=" + posRight); + //Log.d(TAG, "VoicePosition posForward=" + posForward + ", posRight=" + posRight); return new float[]{(float) posForward, (float) posRight, 0.0F}; } private float[] getViewRelativeSizeInAxis(View view) { return new float[]{ AXIS_MAX_DISTANCE * view.getWidth() * 1.0f / (rootView.getWidth() / 2.0f), - AXIS_MAX_DISTANCE * view.getHeight() * 1.0f / (rootView.getHeight() / 2.0f) , + AXIS_MAX_DISTANCE * view.getHeight() * 1.0f / (rootView.getHeight() / 2.0f), }; } private BottomSheetDialog showCommonSettingDialog(boolean isMute, SpatialAudioParams params, - CompoundButton.OnCheckedChangeListener muteCheckListener, - CompoundButton.OnCheckedChangeListener blurCheckListener, - CompoundButton.OnCheckedChangeListener airborneCheckListener, - SeekBar.OnSeekBarChangeListener attenuationSeekChangeListener + CompoundButton.OnCheckedChangeListener muteCheckListener, + CompoundButton.OnCheckedChangeListener blurCheckListener, + CompoundButton.OnCheckedChangeListener airborneCheckListener, + SeekBar.OnSeekBarChangeListener attenuationSeekChangeListener ) { BottomSheetDialog dialog = new BottomSheetDialog(requireContext()); View dialogView = LayoutInflater.from(requireContext()).inflate(R.layout.dialog_spatial_sound, null); @@ -571,12 +581,15 @@ public boolean onTouch(View v, MotionEvent event) { v.setTranslationY(newTranY); onPositionChanged(); break; - case MotionEvent.ACTION_UP: + default: break; } return true; } + /** + * On position changed. 
+ */ protected abstract void onPositionChanged(); } @@ -584,7 +597,7 @@ public boolean onTouch(View v, MotionEvent event) { * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ - private class InnerRtcEngineEventHandler extends IRtcEngineEventHandler { + private final class InnerRtcEngineEventHandler extends IRtcEngineEventHandler { @Override public void onJoinChannelSuccess(String channel, int uid, int elapsed) { super.onJoinChannelSuccess(channel, uid, elapsed); @@ -633,7 +646,9 @@ public void onUserJoined(int uid, int elapsed) { remoteLeftTv.setTag(uid); remoteLeftTv.setVisibility(View.VISIBLE); remoteLeftTv.setText(uid + ""); - localSpatial.updateRemotePosition(uid, getVoicePositionInfo(remoteLeftTv)); + RemoteVoicePositionInfo info = getVoicePositionInfo(remoteLeftTv); + Log.d(TAG, "left remote user >> pos=" + Arrays.toString(info.position)); + localSpatial.updateRemotePosition(uid, info); remoteLeftTv.setOnClickListener(v -> showRemoteUserSettingDialog(uid)); } else if (remoteRightTv.getTag() == null) { diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SwitchCameraScreenShare.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SwitchCameraScreenShare.java index 79ec44673..c43715dcc 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SwitchCameraScreenShare.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/SwitchCameraScreenShare.java @@ -105,30 +105,30 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. 
See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = (RtcEngineEx) RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. 
*/ @@ -154,9 +154,9 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { @Override public void onDestroy() { - /**leaveChannel and Destroy the RtcEngine instance*/ + /*leaveChannel and Destroy the RtcEngine instance*/ if (engine != null) { - if(camera.isChecked()){ + if (camera.isChecked()) { engine.leaveChannelEx(rtcConnection2); } engine.leaveChannel(); @@ -187,8 +187,7 @@ public void onCheckedChanged(CompoundButton compoundButton, boolean b) { options.publishScreenCaptureAudio = true; engine.updateChannelMediaOptions(options); addScreenSharePreview(); - } - else{ + } else { // stop screen capture and update options engine.stopScreenCapture(); options.publishScreenCaptureVideo = false; @@ -200,26 +199,25 @@ public void onCheckedChanged(CompoundButton compoundButton, boolean b) { showAlert(getString(R.string.lowversiontip)); } } else if (compoundButton.getId() == R.id.camera) { - if(b){ - ChannelMediaOptions mediaOptions = new ChannelMediaOptions(); - mediaOptions.autoSubscribeAudio = false; - mediaOptions.autoSubscribeVideo = false; - mediaOptions.publishScreenCaptureVideo = false; - mediaOptions.publishCameraTrack = true; - mediaOptions.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER; - mediaOptions.channelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - rtcConnection2.channelId = et_channel.getText().toString(); - rtcConnection2.localUid = new Random().nextInt(512)+512; - engine.joinChannelEx(null ,rtcConnection2,mediaOptions,iRtcEngineEventHandler); - } - else{ - engine.leaveChannelEx(rtcConnection2); - engine.startPreview(Constants.VideoSourceType.VIDEO_SOURCE_CAMERA_PRIMARY); - } - }else if (compoundButton.getId() == R.id.screenSharePreview) { - if(b){ + if (b) { + ChannelMediaOptions mediaOptions = new ChannelMediaOptions(); + mediaOptions.autoSubscribeAudio = false; + mediaOptions.autoSubscribeVideo = false; + mediaOptions.publishScreenCaptureVideo = false; + mediaOptions.publishCameraTrack = true; + 
mediaOptions.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER; + mediaOptions.channelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; + rtcConnection2.channelId = et_channel.getText().toString(); + rtcConnection2.localUid = new Random().nextInt(512) + 512; + engine.joinChannelEx(null, rtcConnection2, mediaOptions, iRtcEngineEventHandler); + } else { + engine.leaveChannelEx(rtcConnection2); + engine.startPreview(Constants.VideoSourceType.VIDEO_SOURCE_CAMERA_PRIMARY); + } + } else if (compoundButton.getId() == R.id.screenSharePreview) { + if (b) { addScreenSharePreview(); - }else{ + } else { fl_screen.removeAllViews(); engine.stopPreview(Constants.VideoSourceType.VIDEO_SOURCE_SCREEN_PRIMARY); } @@ -243,8 +241,7 @@ public void onClick(View v) { Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); @@ -261,16 +258,16 @@ public void onClick(View v) { engine.leaveChannelEx(rtcConnection2); camera.setChecked(false); } - if(screenSharePreview.isChecked()){ + if (screenSharePreview.isChecked()) { engine.stopPreview(Constants.VideoSourceType.VIDEO_SOURCE_SCREEN_PRIMARY); screenSharePreview.setChecked(false); } - if(screenShare.isChecked()){ + if (screenShare.isChecked()) { engine.stopScreenCapture(); screenShare.setChecked(false); } engine.stopPreview(Constants.VideoSourceType.VIDEO_SOURCE_CAMERA_PRIMARY); - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. 
@@ -351,7 +348,7 @@ private void joinChannel(String channelId) { options.enableAudioRecordingOrPlayout = false; options.publishEncodedVideoTrack = false; - /**Enable video module*/ + /*Enable video module*/ engine.enableVideo(); // Setup video encoding configs engine.setVideoEncoderConfiguration(new VideoEncoderConfiguration( @@ -360,16 +357,16 @@ private void joinChannel(String channelId) { STANDARD_BITRATE, ORIENTATION_MODE_ADAPTIVE )); - /**Set up to play remote sound with receiver*/ + /*Set up to play remote sound with receiver*/ engine.setDefaultAudioRoutetoSpeakerphone(true); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, accessToken -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. 
if you do not specify the uid, we will generate the uid for you*/ int res = engine.joinChannel(accessToken, channelId, 0, options); if (res != 0) { diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ThirdPartyBeauty.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ThirdPartyBeauty.java index 84d6deced..8e321bad9 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ThirdPartyBeauty.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ThirdPartyBeauty.java @@ -26,6 +26,9 @@ import io.agora.api.example.examples.advanced.beauty.FaceUnityBeautySDK; import io.agora.api.example.examples.advanced.beauty.SenseTimeBeautySDK; +/** + * The type Third party beauty. + */ @Example( index = 24, group = ADVANCED, diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VideoMetadata.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VideoMetadata.java index a9f04499a..ef2ff8b4a 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VideoMetadata.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VideoMetadata.java @@ -42,6 +42,9 @@ import io.agora.rtc2.video.VideoCanvas; import io.agora.rtc2.video.VideoEncoderConfiguration; +/** + * The type Video metadata. + */ @Example( index = 3, group = ADVANCED, @@ -49,8 +52,10 @@ actionId = R.id.action_mainFragment_to_VideoMetadata, tipsId = R.string.videometadata ) -public class VideoMetadata extends BaseFragment implements View.OnClickListener -{ +public class VideoMetadata extends BaseFragment implements View.OnClickListener { + /** + * The constant TAG. 
+ */ public static final String TAG = VideoMetadata.class.getSimpleName(); private FrameLayout fl_local, fl_remote; private Button send, join; @@ -69,15 +74,13 @@ public class VideoMetadata extends BaseFragment implements View.OnClickListener @Nullable @Override - public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) - { + public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) { View view = inflater.inflate(R.layout.fragment_video_metadata, container, false); return view; } @Override - public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) - { + public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) { super.onViewCreated(view, savedInstanceState); send = view.findViewById(R.id.btn_send); send.setOnClickListener(this); @@ -90,42 +93,39 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat } @Override - public void onActivityCreated(@Nullable Bundle savedInstanceState) - { + public void onActivityCreated(@Nullable Bundle savedInstanceState) { super.onActivityCreated(savedInstanceState); // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } - try - { + try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. 
Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -144,20 +144,17 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) engine.setLocalAccessPoint(localAccessPointConfiguration); } } - catch (Exception e) - { + catch (Exception e) { e.printStackTrace(); getActivity().onBackPressed(); } } @Override - public void onDestroy() - { + public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ - if (engine != null) - { + /*leaveChannel and Destroy the RtcEngine instance*/ + if (engine != null) { engine.leaveChannel(); } handler.post(RtcEngine::destroy); @@ -165,18 +162,14 @@ public void onDestroy() } @Override - public void onClick(View v) - { - if (v.getId() == R.id.btn_join) - { - if (!joined) - { + public void onClick(View v) { + if (v.getId() == R.id.btn_join) { + if (!joined) { CommonUtil.hideInputBoard(getActivity(), et_channel); // call when join button hit String channelId = et_channel.getText().toString(); // Check permission - if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, 
Permission.Group.CAMERA)) - { + if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) { joinChannel(channelId); return; } @@ -185,16 +178,13 @@ public void onClick(View v) Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); - } - else - { + } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. @@ -215,22 +205,18 @@ public void onClick(View v) send.setEnabled(false); join.setText(getString(R.string.join)); } - } - else if (v.getId() == R.id.btn_send) - { - /**Click once, the metadata is sent once. + } else if (v.getId() == R.id.btn_send) { + /*Click once, the metadata is sent once. * {@link VideoMetadata#iMetadataObserver}. 
* The metadata here can be flexibly replaced according to your own business.*/ metadata = String.valueOf(System.currentTimeMillis()).getBytes(Charset.forName("UTF-8")); } } - private void joinChannel(String channelId) - { + private void joinChannel(String channelId) { // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } @@ -241,7 +227,7 @@ private void joinChannel(String channelId) // Setup local video to render your local camera preview engine.setupLocalVideo(new VideoCanvas(surfaceView, RENDER_MODE_HIDDEN, 0)); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enable video module engine.enableVideo(); @@ -252,26 +238,25 @@ private void joinChannel(String channelId) STANDARD_BITRATE, ORIENTATION_MODE_ADAPTIVE )); - /**Set up to play remote sound with receiver*/ + /*Set up to play remote sound with receiver*/ engine.setDefaultAudioRoutetoSpeakerphone(true); - /**register metadata observer + /*register metadata observer * @return 0: Success * < 0: Failure*/ int code = engine.registerMediaMetadataObserver(iMetadataObserver, IMetadataObserver.VIDEO_METADATA); Log.e(TAG, code + ""); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, accessToken -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel.
if you do not specify the uid, we will generate the uid for you*/ int res = engine.joinChannel(accessToken, channelId, "Extra Optional Data", 0); - if (res != 0) - { + if (res != 0) { // Usually happens with invalid parameters // Error code description can be found at: // en: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html @@ -287,12 +272,10 @@ private void joinChannel(String channelId) /** * By implementing this interface, metadata can be sent and received with video frames. */ - private final IMetadataObserver iMetadataObserver = new IMetadataObserver() - { + private final IMetadataObserver iMetadataObserver = new IMetadataObserver() { /**Returns the maximum data size of Metadata*/ @Override - public int getMaxMetadataSize() - { + public int getMaxMetadataSize() { return MAX_META_SIZE; } @@ -302,19 +285,16 @@ public int getMaxMetadataSize() * @return The metadata that you want to send in the format of byte[]. Ensure that you set the return value. * PS: Ensure that the size of the metadata does not exceed the value set in the getMaxMetadataSize callback.*/ @Override - public byte[] onReadyToSendMetadata(long timeStampMs, int sourceType) - { - /**Check if the metadata is empty.*/ - if (metadata == null) - { + public byte[] onReadyToSendMetadata(long timeStampMs, int sourceType) { + /*Check if the metadata is empty.*/ + if (metadata == null) { return null; } Log.i(TAG, "There is metadata to send!"); - /**Recycle metadata objects.*/ + /*Recycle metadata objects.*/ byte[] toBeSend = metadata; metadata = null; - if (toBeSend.length > MAX_META_SIZE) - { + if (toBeSend.length > MAX_META_SIZE) { Log.e(TAG, String.format("Metadata exceeding max length %d!", MAX_META_SIZE)); return null; } @@ -334,8 +314,7 @@ public void run() { * @param uid The ID of the user who sent the metadata. 
* @param timeStampMs The timestamp (ms) of the received metadata.*/ @Override - public void onMetadataReceived(byte[] buffer, int uid, long timeStampMs) - { + public void onMetadataReceived(byte[] buffer, int uid, long timeStampMs) { String data = new String(buffer, Charset.forName("UTF-8")); handler.post(new Runnable() { @Override @@ -351,8 +330,7 @@ public void run() { * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ - private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() - { + private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() { /** * Error code description can be found at: @@ -360,8 +338,7 @@ public void run() { * cn: https://docs.agora.io/cn/video-call-4.x/API%20Reference/java_ng/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror */ @Override - public void onError(int err) - { + public void onError(int err) { Log.e(TAG, String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err))); showAlert(String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err))); } @@ -370,8 +347,7 @@ public void onError(int err) * @param stats With this callback, the application retrieves the channel information, * such as the call duration and statistics.*/ @Override - public void onLeaveChannel(RtcStats stats) - { + public void onLeaveChannel(RtcStats stats) { super.onLeaveChannel(stats); Log.i(TAG, String.format("local user %d leaveChannel!", myUid)); showLongToast(String.format("local user %d leaveChannel!", myUid)); @@ -384,17 +360,14 @@ public void onLeaveChannel(RtcStats stats) * @param uid User ID * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/ @Override - public void onJoinChannelSuccess(String channel, int uid, int elapsed) - { + public void onJoinChannelSuccess(String 
channel, int uid, int elapsed) { Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); showLongToast(String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); myUid = uid; joined = true; - handler.post(new Runnable() - { + handler.post(new Runnable() { @Override - public void run() - { + public void run() { send.setEnabled(true); join.setEnabled(true); join.setText(getString(R.string.leave)); @@ -478,8 +451,7 @@ public void onRemoteAudioStateChanged(int uid, int state, int reason, int elapse * @param elapsed Time elapsed (ms) from the local user calling the joinChannel method until * the SDK triggers this callback.*/ @Override - public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) - { + public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) { super.onRemoteVideoStateChanged(uid, state, reason, elapsed); Log.i(TAG, "onRemoteVideoStateChanged->" + uid + ", state->" + state + ", reason->" + reason); } @@ -489,23 +461,20 @@ public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapse * @param elapsed Time delay (ms) from the local user calling joinChannel/setClientRole * until this callback is triggered.*/ @Override - public void onUserJoined(int uid, int elapsed) - { + public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; } - handler.post(() -> - { - /**Display remote video stream*/ + handler.post(() -> { + /*Display remote video stream*/ SurfaceView surfaceView = new SurfaceView(context); surfaceView.setZOrderMediaOverlay(true); - if (fl_remote.getChildCount() > 0) - { + if (fl_remote.getChildCount() > 0) { fl_remote.removeAllViews(); } // Add to the remote container @@ -527,16 +496,13 @@ 
public void onUserJoined(int uid, int elapsed) * USER_OFFLINE_BECOME_AUDIENCE(2): (Live broadcast only.) The client role switched from * the host to the audience.*/ @Override - public void onUserOffline(int uid, int reason) - { + public void onUserOffline(int uid, int reason) { Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason)); showLongToast(String.format("user %d offline! reason:%d", uid, reason)); - handler.post(new Runnable() - { + handler.post(new Runnable() { @Override - public void run() - { - /**Clear render view + public void run() { + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VideoProcessExtension.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VideoProcessExtension.java index cad9ec869..b01a5c673 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VideoProcessExtension.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VideoProcessExtension.java @@ -79,15 +79,13 @@ public class VideoProcessExtension extends BaseFragment implements View.OnClickL @Nullable @Override - public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) - { + public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) { View view = inflater.inflate(R.layout.fragment_video_enhancement, container, false); return view; } @Override - public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) - { + public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) { super.onViewCreated(view, savedInstanceState); join = 
view.findViewById(R.id.btn_join); join.setOnClickListener(this); @@ -148,49 +146,46 @@ private void resetVirtualBackground() { backgroundSource.source = "https://agora-adc-artifacts.s3.cn-north-1.amazonaws.com.cn/resources/sample.mp4"; } engine.enableVirtualBackground(true, backgroundSource, segproperty); - }else{ + } else { engine.enableVirtualBackground(false, null, null); } } @Override - public void onActivityCreated(@Nullable Bundle savedInstanceState) - { + public void onActivityCreated(@Nullable Bundle savedInstanceState) { super.onActivityCreated(savedInstanceState); // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } - try - { + try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. 
*/ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -210,21 +205,17 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) } engine.enableExtension("agora_video_filters_clear_vision", "clear_vision", true); - } - catch (Exception e) - { + } catch (Exception e) { e.printStackTrace(); getActivity().onBackPressed(); } } @Override - public void onDestroy() - { + public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ - if(engine != null) - { + /*leaveChannel and Destroy the RtcEngine instance*/ + if (engine != null) { engine.leaveChannel(); } handler.post(RtcEngine::destroy); @@ -232,20 +223,16 @@ public void onDestroy() } - - private void joinChannel(String channelId) - { + private void joinChannel(String channelId) { // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } // Create render view by RtcEngine SurfaceView surfaceView = new SurfaceView(context); - if(fl_local.getChildCount() > 0) - { + if (fl_local.getChildCount() > 0) { fl_local.removeAllViews(); } // Add to the local container @@ -255,25 +242,25 @@ private void joinChannel(String channelId) // Set audio route to microPhone engine.setDefaultAudioRoutetoSpeakerphone(true); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enable video module engine.enableVideo(); 
// Setup video encoding configs engine.setVideoEncoderConfiguration(new VideoEncoderConfiguration( - ((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), - VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), + ((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), + VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), STANDARD_BITRATE, - VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) + VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) )); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, accessToken -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. 
if you do not specify the uid, we will generate the uid for you*/ ChannelMediaOptions option = new ChannelMediaOptions(); option.autoSubscribeAudio = true; @@ -281,8 +268,7 @@ private void joinChannel(String channelId) option.publishMicrophoneTrack = true; option.publishCameraTrack = true; int res = engine.joinChannel(accessToken, channelId, 0, option); - if (res != 0) - { + if (res != 0) { // Usually happens with invalid parameters // Error code description can be found at: // en: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html @@ -297,16 +283,13 @@ private void joinChannel(String channelId) @Override public void onClick(View v) { - if (v.getId() == R.id.btn_join) - { - if (!joined) - { + if (v.getId() == R.id.btn_join) { + if (!joined) { CommonUtil.hideInputBoard(getActivity(), et_channel); // call when join button hit String channelId = et_channel.getText().toString(); // Check permission - if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) - { + if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) { joinChannel(channelId); return; } @@ -315,16 +298,13 @@ public void onClick(View v) { Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); - } - else - { + } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. 
@@ -350,7 +330,7 @@ public void onClick(View v) { @Override public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) { - if(buttonView.getId() == beauty.getId()){ + if (buttonView.getId() == beauty.getId()) { if (isChecked && !engine.isFeatureAvailableOnDevice(Constants.FEATURE_VIDEO_BEAUTY_EFFECT)) { buttonView.setChecked(false); Toast.makeText(requireContext(), R.string.feature_unavailable, Toast.LENGTH_SHORT).show(); @@ -358,33 +338,29 @@ public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) { } engine.setBeautyEffectOptions(isChecked, beautyOptions); - } - else if(buttonView.getId() == lightness2.getId()){ + } else if (buttonView.getId() == lightness2.getId()) { LowLightEnhanceOptions options = new LowLightEnhanceOptions(); options.lowlightEnhanceLevel = LowLightEnhanceOptions.LOW_LIGHT_ENHANCE_LEVEL_FAST; options.lowlightEnhanceMode = LowLightEnhanceOptions.LOW_LIGHT_ENHANCE_AUTO; engine.setLowlightEnhanceOptions(isChecked, options); - } - else if(buttonView.getId() == colorful2.getId()){ + } else if (buttonView.getId() == colorful2.getId()) { setColorEnhance(isChecked); - } - else if(buttonView.getId() == virtualBackground.getId()){ + } else if (buttonView.getId() == virtualBackground.getId()) { if (isChecked && !engine.isFeatureAvailableOnDevice(Constants.FEATURE_VIDEO_VIRTUAL_BACKGROUND)) { buttonView.setChecked(false); Toast.makeText(requireContext(), R.string.feature_unavailable, Toast.LENGTH_SHORT).show(); return; } resetVirtualBackground(); - } - else if(buttonView.getId() == noiseReduce2.getId()){ + } else if (buttonView.getId() == noiseReduce2.getId()) { VideoDenoiserOptions options = new VideoDenoiserOptions(); - options.denoiserLevel = VideoDenoiserOptions.VIDEO_DENOISER_AUTO; - options.denoiserMode = VideoDenoiserOptions.VIDEO_DENOISER_LEVEL_HIGH_QUALITY; + options.denoiserLevel = VideoDenoiserOptions.VIDEO_DENOISER_LEVEL_HIGH_QUALITY; + options.denoiserMode = VideoDenoiserOptions.VIDEO_DENOISER_AUTO; 
engine.setVideoDenoiserOptions(isChecked, options); } } - private void setColorEnhance(boolean isChecked){ + private void setColorEnhance(boolean isChecked) { ColorEnhanceOptions options = new ColorEnhanceOptions(); options.strengthLevel = (float) strength; options.skinProtectLevel = (float) skinProtect; @@ -394,27 +370,22 @@ private void setColorEnhance(boolean isChecked){ @Override public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) { float value = ((float) progress) / 10; - if(seekBar.getId() == seek_lightness.getId()){ + if (seekBar.getId() == seek_lightness.getId()) { beautyOptions.lighteningLevel = value; engine.setBeautyEffectOptions(beauty.isChecked(), beautyOptions); - } - else if(seekBar.getId() == seek_redness.getId()){ + } else if (seekBar.getId() == seek_redness.getId()) { beautyOptions.rednessLevel = value; engine.setBeautyEffectOptions(beauty.isChecked(), beautyOptions); - } - else if(seekBar.getId() == seek_sharpness.getId()){ + } else if (seekBar.getId() == seek_sharpness.getId()) { beautyOptions.sharpnessLevel = value; engine.setBeautyEffectOptions(beauty.isChecked(), beautyOptions); - } - else if(seekBar.getId() == seek_smoothness.getId()){ + } else if (seekBar.getId() == seek_smoothness.getId()) { beautyOptions.smoothnessLevel = value; engine.setBeautyEffectOptions(beauty.isChecked(), beautyOptions); - } - else if(seekBar.getId() == seek_strength.getId()) { + } else if (seekBar.getId() == seek_strength.getId()) { strength = value; setColorEnhance(colorful2.isChecked()); - } - else if(seekBar.getId() == seek_skin.getId()) { + } else if (seekBar.getId() == seek_skin.getId()) { skinProtect = value; setColorEnhance(colorful2.isChecked()); } @@ -435,16 +406,14 @@ public void onStopTrackingTouch(SeekBar seekBar) { * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. 
*/ - private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() - { + private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() { /** * Error code description can be found at: * en: https://api-ref.agora.io/en/video-sdk/android/4.x/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror * cn: https://docs.agora.io/cn/video-call-4.x/API%20Reference/java_ng/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror */ @Override - public void onError(int err) - { + public void onError(int err) { Log.w(TAG, String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err))); } @@ -452,8 +421,7 @@ public void onError(int err) * @param stats With this callback, the application retrieves the channel information, * such as the call duration and statistics.*/ @Override - public void onLeaveChannel(RtcStats stats) - { + public void onLeaveChannel(RtcStats stats) { super.onLeaveChannel(stats); Log.i(TAG, String.format("local user %d leaveChannel!", myUid)); showLongToast(String.format("local user %d leaveChannel!", myUid)); @@ -466,17 +434,14 @@ public void onLeaveChannel(RtcStats stats) * @param uid User ID * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/ @Override - public void onJoinChannelSuccess(String channel, int uid, int elapsed) - { + public void onJoinChannelSuccess(String channel, int uid, int elapsed) { Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); showLongToast(String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); myUid = uid; joined = true; - handler.post(new Runnable() - { + handler.post(new Runnable() { @Override - public void run() - { + public void run() { join.setEnabled(true); join.setText(getString(R.string.leave)); controlPanel.setVisibility(View.VISIBLE); @@ -560,8 +525,7 @@ public void onRemoteAudioStateChanged(int uid, 
int state, int reason, int elapse * @param elapsed Time elapsed (ms) from the local user calling the joinChannel method until * the SDK triggers this callback.*/ @Override - public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) - { + public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) { super.onRemoteVideoStateChanged(uid, state, reason, elapsed); Log.i(TAG, "onRemoteVideoStateChanged->" + uid + ", state->" + state + ", reason->" + reason); } @@ -571,23 +535,20 @@ public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapse * @param elapsed Time delay (ms) from the local user calling joinChannel/setClientRole * until this callback is triggered.*/ @Override - public void onUserJoined(int uid, int elapsed) - { + public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; - } - else{ - handler.post(() -> - { - if(fl_remote.getChildCount() > 0){ + } else { + handler.post(() -> { + if (fl_remote.getChildCount() > 0) { fl_remote.removeAllViews(); } - /**Display remote video stream*/ + /*Display remote video stream*/ SurfaceView surfaceView = null; // Create render view by RtcEngine surfaceView = new SurfaceView(context); @@ -611,14 +572,13 @@ public void onUserJoined(int uid, int elapsed) * USER_OFFLINE_BECOME_AUDIENCE(2): (Live broadcast only.) The client role switched from * the host to the audience.*/ @Override - public void onUserOffline(int uid, int reason) - { + public void onUserOffline(int uid, int reason) { Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason)); showLongToast(String.format("user %d offline! 
reason:%d", uid, reason)); handler.post(new Runnable() { @Override public void run() { - /**Clear render view + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VideoQuickSwitch.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VideoQuickSwitch.java index 8a7d6e28f..79b25a570 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VideoQuickSwitch.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VideoQuickSwitch.java @@ -1,7 +1,7 @@ package io.agora.api.example.examples.advanced; import static io.agora.api.example.common.model.Examples.ADVANCED; -import static io.agora.rtc2.Constants.REMOTE_VIDEO_STATE_PLAYING; +import static io.agora.rtc2.Constants.REMOTE_VIDEO_STATE_DECODING; import static io.agora.rtc2.video.VideoCanvas.RENDER_MODE_HIDDEN; import static io.agora.rtc2.video.VideoEncoderConfiguration.FRAME_RATE.FRAME_RATE_FPS_15; import static io.agora.rtc2.video.VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_ADAPTIVE; @@ -47,13 +47,16 @@ import io.agora.rtc2.video.VideoCanvas; import io.agora.rtc2.video.VideoEncoderConfiguration; -/**---------------------------------------Important!!!---------------------------------------------- +/** + * ---------------------------------------Important!!!---------------------------------------------- * This example demonstrates how audience can quickly switch channels. The following points need to be noted: - 1: You can only access the channel as an audience{@link VideoQuickSwitch#joinChannel(String)}. 
- 2: If you want to see a normal remote screen, you need to set up several live rooms in advance and - push the stream as a live one (the name of the live room is in the channels instance{"channel0", "channel1", "channel2"}; - at the same time, the appid you used to set up the live room should be consistent with this example program). - * @author cjw*/ + * 1: You can only access the channel as an audience{@link VideoQuickSwitch#joinChannel(String)}. + * 2: If you want to see a normal remote screen, you need to set up several live rooms in advance and + * push the stream as a live one (the name of the live room is in the channels instance{"channel0", "channel1", "channel2"}; + * at the same time, the appid you used to set up the live room should be consistent with this example program). + * + * @author cjw + */ @Example( index = 12, group = ADVANCED, @@ -61,8 +64,7 @@ actionId = R.id.action_mainFragment_to_QuickSwitch, tipsId = R.string.quickswitchchannel ) -public class VideoQuickSwitch extends BaseFragment implements CompoundButton.OnCheckedChangeListener -{ +public class VideoQuickSwitch extends BaseFragment implements CompoundButton.OnCheckedChangeListener { private static final String TAG = VideoQuickSwitch.class.getSimpleName(); private ViewPager viewPager; private RtcEngine engine; @@ -76,56 +78,50 @@ public class VideoQuickSwitch extends BaseFragment implements CompoundButton.OnC private FrameLayout fl_local; private boolean publish = false; private Switch localVideo; - private Runnable runnable = new Runnable() - { + private Runnable runnable = new Runnable() { @Override - public void run() - { - if(noBroadcaster) - { - /**There is no broadcaster in the current channel*/ + public void run() { + if (noBroadcaster) { + /*There is no broadcaster in the current channel*/ viewPagerAdapter.notifyBroadcaster(currentIndex, !noBroadcaster); } } }; @Override - public void onCreate(@Nullable Bundle savedInstanceState) - { + public void onCreate(@Nullable Bundle 
savedInstanceState) { super.onCreate(savedInstanceState); // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } - try - { + try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -143,9 +139,7 @@ public void onCreate(@Nullable Bundle savedInstanceState) // This api can only be used in the private media server scenario, otherwise some problems may occur. 
engine.setLocalAccessPoint(localAccessPointConfiguration); } - } - catch (Exception e) - { + } catch (Exception e) { e.printStackTrace(); getActivity().onBackPressed(); } @@ -153,48 +147,39 @@ public void onCreate(@Nullable Bundle savedInstanceState) @Nullable @Override - public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) - { + public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) { View view = inflater.inflate(R.layout.fragment_quick_switch_channel, container, false); return view; } @Override - public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) - { + public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) { super.onViewCreated(view, savedInstanceState); fl_local = view.findViewById(R.id.local_video); localVideo = view.findViewById(R.id.enableLocal); viewPager = view.findViewById(R.id.viewPager); - /**Prepare data*/ - for (String channel : channels) - { + /*Prepare data*/ + for (String channel : channels) { channelList.add(channel); } viewPagerAdapter = new ViewPagerAdapter(getContext(), channelList); viewPager.setAdapter(viewPagerAdapter); - viewPager.addOnPageChangeListener(new ViewPager.OnPageChangeListener() - { + viewPager.addOnPageChangeListener(new ViewPager.OnPageChangeListener() { @Override - public void onPageScrolled(int position, float positionOffset, int positionOffsetPixels) - { - if (positionOffset == 0f && position != currentIndex) - { - viewPager.post(new Runnable() - { + public void onPageScrolled(int position, float positionOffset, int positionOffsetPixels) { + if (positionOffset == 0f && position != currentIndex) { + viewPager.post(new Runnable() { @Override - public void run() - { + public void run() { Log.i(TAG, "Will switch channel to " + channelList.get(position)); currentIndex = position; - if (lastIndex >= 0) - { + if (lastIndex >= 0) 
{ viewPagerAdapter.removeSurfaceViewByIndex(lastIndex); viewPagerAdapter.notifyBroadcaster(lastIndex, true); } - /**Since v2.9.0. + /*Since v2.9.0. * Switches to a different channel. * This method allows the audience of a Live-broadcast channel to switch to a different channel. * After the user successfully switches to another channel, the onLeaveChannel @@ -231,25 +216,25 @@ public void run() }); } } + @Override - public void onPageSelected(int position) - {} + public void onPageSelected(int position) { + } + @Override - public void onPageScrollStateChanged(int state) - {} + public void onPageScrollStateChanged(int state) { + } }); - /**Swipe left and right to switch channel tips*/ + /*Swipe left and right to switch channel tips*/ showAlert(getString(R.string.swiptips)); localVideo.setOnCheckedChangeListener(this); } @Override - public void onActivityCreated(@Nullable Bundle savedInstanceState) - { + public void onActivityCreated(@Nullable Bundle savedInstanceState) { super.onActivityCreated(savedInstanceState); // Check permission - if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) - { + if (AndPermission.hasPermissions(this, Permission.Group.STORAGE, Permission.Group.MICROPHONE, Permission.Group.CAMERA)) { joinChannel(channelList.get(0)); return; } @@ -257,39 +242,33 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) AndPermission.with(this).runtime().permission( Permission.READ_EXTERNAL_STORAGE, Permission.WRITE_EXTERNAL_STORAGE - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelList.get(0)); }).start(); } @Override - public void onDestroy() - { + public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ - if (engine != null) - { + /*leaveChannel and Destroy the RtcEngine instance*/ + if (engine != null) { engine.leaveChannel(); } handler.post(RtcEngine::destroy); engine = null; 
} - private final void joinChannel(String channelId) - { + private void joinChannel(String channelId) { // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } - if(publish){ + if (publish) { // Create render view by RtcEngine SurfaceView surfaceView = new SurfaceView(context); - if(fl_local.getChildCount() > 0) - { + if (fl_local.getChildCount() > 0) { fl_local.removeAllViews(); } surfaceView.setZOrderMediaOverlay(true); @@ -299,8 +278,8 @@ private final void joinChannel(String channelId) // Setup local video to render your local camera preview engine.setupLocalVideo(new VideoCanvas(surfaceView, Constants.RENDER_MODE_HIDDEN, 0)); } - /**In the demo, the default is to enter as the broadcaster.*/ - engine.setClientRole(publish?Constants.CLIENT_ROLE_BROADCASTER:Constants.CLIENT_ROLE_AUDIENCE); + /*In the demo, the default is to enter as the broadcaster.*/ + engine.setClientRole(publish ? Constants.CLIENT_ROLE_BROADCASTER : Constants.CLIENT_ROLE_AUDIENCE); engine.startPreview(); // Enable video module engine.enableVideo(); @@ -311,16 +290,16 @@ private final void joinChannel(String channelId) STANDARD_BITRATE, ORIENTATION_MODE_ADAPTIVE )); - /**Set up to play remote sound with receiver*/ + /*Set up to play remote sound with receiver*/ engine.setDefaultAudioRoutetoSpeakerphone(true); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, accessToken -> { - /**Allows a user to join a channel. 
+ /*Allows a user to join a channel. * if you do not specify the uid, we will generate the uid for you. * If your account has enabled token mechanism through the console, you must fill in the * corresponding token here. In general, it is not recommended to open the token mechanism in the test phase.*/ @@ -330,8 +309,7 @@ private final void joinChannel(String channelId) option.publishMicrophoneTrack = true; option.publishCameraTrack = true; int res = engine.joinChannel(accessToken, channelId, 0, option); - if (res != 0) - { + if (res != 0) { // Usually happens with invalid parameters // Error code description can be found at: // en: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html @@ -346,16 +324,14 @@ private final void joinChannel(String channelId) * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ - private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() - { + private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() { /** * Error code description can be found at: * en: https://api-ref.agora.io/en/video-sdk/android/4.x/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror * cn: https://docs.agora.io/cn/video-call-4.x/API%20Reference/java_ng/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror */ @Override - public void onError(int err) - { + public void onError(int err) { Log.e(TAG, String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err))); showAlert(String.format("onError code %d message %s", err, RtcEngine.getErrorDescription(err))); } @@ -367,8 +343,7 @@ public void onError(int err) * Important! 
Because the channel is entered by the role of an audience, this callback will * only be received when the broadcaster exits the channel.*/ @Override - public void onLeaveChannel(RtcStats stats) - { + public void onLeaveChannel(RtcStats stats) { super.onLeaveChannel(stats); Log.i(TAG, String.format("local user %d leaveChannel!", myUid)); showLongToast(String.format("local user %d leaveChannel!", myUid)); @@ -382,12 +357,11 @@ public void onLeaveChannel(RtcStats stats) * @param uid User ID * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/ @Override - public void onJoinChannelSuccess(String channel, int uid, int elapsed) - { + public void onJoinChannelSuccess(String channel, int uid, int elapsed) { Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); showLongToast(String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); myUid = uid; - /**Determine if there is a host in the channel*/ + /*Determine if there is a host in the channel*/ noBroadcaster = true; handler.post(runnable); } @@ -440,7 +414,7 @@ public void onRemoteAudioStateChanged(int uid, int state, int reason, int elapse * to REMOTE_VIDEO_STATE_REASON_LOCAL_MUTED(3), REMOTE_VIDEO_STATE_REASON_REMOTE_MUTED(5), * or REMOTE_VIDEO_STATE_REASON_REMOTE_OFFLINE(7). * REMOTE_VIDEO_STATE_STARTING(1): The first remote video packet is received. - * REMOTE_VIDEO_STATE_PLAYING(2): The remote video stream is decoded and plays normally, + * REMOTE_VIDEO_STATE_DECODING(2): The remote video stream is decoded and plays normally, * probably due to REMOTE_VIDEO_STATE_REASON_NETWORK_RECOVERY (2), * REMOTE_VIDEO_STATE_REASON_LOCAL_UNMUTED(4), REMOTE_VIDEO_STATE_REASON_REMOTE_UNMUTED(6), * or REMOTE_VIDEO_STATE_REASON_AUDIO_FALLBACK_RECOVERY(9). 
@@ -468,13 +442,11 @@ public void onRemoteAudioStateChanged(int uid, int state, int reason, int elapse * @param elapsed Time elapsed (ms) from the local user calling the joinChannel method until * the SDK triggers this callback.*/ @Override - public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) - { + public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) { super.onRemoteVideoStateChanged(uid, state, reason, elapsed); Log.i(TAG, "onRemoteVideoStateChanged->" + uid + ", state->" + state + ", reason->" + reason); - if(state == REMOTE_VIDEO_STATE_PLAYING) - { - /**REMOTE_VIDEO_STATE_PLAYING as the basis for judging whether there is a broadcaster + if (state == REMOTE_VIDEO_STATE_DECODING) { + /*REMOTE_VIDEO_STATE_DECODING as the basis for judging whether there is a broadcaster * in the channel. * But you should judge according to your own business logic, here is just for example, * not for reference.*/ @@ -490,20 +462,17 @@ public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapse * Important! Because the channel is entered by the role of an audience, this callback will * only be received when the broadcaster exits the channel.*/ @Override - public void onUserJoined(int uid, int elapsed) - { + public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; } - handler.post(() -> - { - if(uid != myUid) - { + handler.post(() -> { + if (uid != myUid) { SurfaceView surfaceV = new SurfaceView(getContext().getApplicationContext()); surfaceV.setZOrderMediaOverlay(true); engine.setupRemoteVideo(new VideoCanvas(surfaceV, VideoCanvas.RENDER_MODE_HIDDEN, uid)); @@ -526,16 +495,13 @@ public void onUserJoined(int uid, int elapsed) * Important! 
Because the channel is entered by the role of an audience, this callback will * only be received when the broadcaster exits the channel.*/ @Override - public void onUserOffline(int uid, int reason) - { + public void onUserOffline(int uid, int reason) { Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason)); showLongToast(String.format("user %d offline! reason:%d", uid, reason)); - handler.post(new Runnable() - { + handler.post(new Runnable() { @Override - public void run() - { - /**Clear render view + public void run() { + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); @@ -548,11 +514,10 @@ public void run() @Override public void onCheckedChanged(CompoundButton compoundButton, boolean b) { publish = b; - if(fl_local.getChildCount() > 0) - { + if (fl_local.getChildCount() > 0) { fl_local.removeAllViews(); } - if(b){ + if (b) { engine.startPreview(); // Create render view by RtcEngine SurfaceView surfaceView = new SurfaceView(getContext()); @@ -563,27 +528,32 @@ public void onCheckedChanged(CompoundButton compoundButton, boolean b) { // Setup local video to render your local camera preview engine.setupLocalVideo(new VideoCanvas(surfaceView, Constants.RENDER_MODE_HIDDEN, 0)); } - engine.setClientRole(b?Constants.CLIENT_ROLE_BROADCASTER:Constants.CLIENT_ROLE_AUDIENCE); + engine.setClientRole(b ? Constants.CLIENT_ROLE_BROADCASTER : Constants.CLIENT_ROLE_AUDIENCE); } - public class ViewPagerAdapter extends PagerAdapter - { + /** + * The type View pager adapter. + */ + public class ViewPagerAdapter extends PagerAdapter { private SparseArray viewList = new SparseArray<>(); private Context context; private List roomNameList = new ArrayList<>(); - public ViewPagerAdapter(Context context, List roomNameList) - { + /** + * Instantiates a new View pager adapter. 
+ * + * @param context the context + * @param roomNameList the room name list + */ + public ViewPagerAdapter(Context context, List roomNameList) { this.context = context; this.roomNameList = roomNameList; } @Override - public Object instantiateItem(ViewGroup collection, int position) - { + public Object instantiateItem(ViewGroup collection, int position) { ViewGroup layout = viewList.get(position); - if (layout == null) - { + if (layout == null) { LayoutInflater inflater = LayoutInflater.from(context); layout = (ViewGroup) inflater.inflate(R.layout.view_item_quickswitch, collection, false); viewList.put(position, layout); @@ -598,22 +568,18 @@ public Object instantiateItem(ViewGroup collection, int position) } @Override - public void destroyItem(ViewGroup collection, int position, Object view) - { + public void destroyItem(ViewGroup collection, int position, Object view) { collection.removeView((View) view); } @Override - public int getCount() - { + public int getCount() { return roomNameList.size(); } - private void setSurfaceView(int position, final int uid, final SurfaceView view) - { + private void setSurfaceView(int position, final int uid, final SurfaceView view) { final ViewGroup viewGroup = viewList.get(position); - if (viewGroup != null) - { + if (viewGroup != null) { ViewGroup surfaceContainer = viewGroup.findViewById(R.id.fl_remote); surfaceContainer.removeAllViews(); view.setZOrderMediaOverlay(true); @@ -627,30 +593,24 @@ private void setSurfaceView(int position, final int uid, final SurfaceView view) } } - private void removeSurfaceView(int uid) - { - for (int i = 0; i < viewList.size(); i++) - { + private void removeSurfaceView(int uid) { + for (int i = 0; i < viewList.size(); i++) { ViewGroup viewGroup = viewList.get(i); - if (viewGroup.getTag() != null && ((Integer) viewGroup.getTag()) == uid) - { + if (viewGroup.getTag() != null && ((Integer) viewGroup.getTag()) == uid) { removeSurfaceView(viewGroup); } } } - private void 
removeSurfaceViewByIndex(int index) - { + private void removeSurfaceViewByIndex(int index) { ViewGroup viewGroup = viewList.get(index); - if (viewGroup != null) - { + if (viewGroup != null) { removeSurfaceView(viewGroup); } } - private void removeSurfaceView(ViewGroup viewGroup) - { + private void removeSurfaceView(ViewGroup viewGroup) { ViewGroup surfaceContainer = viewGroup.findViewById(R.id.fl_remote); surfaceContainer.removeAllViews(); @@ -658,25 +618,27 @@ private void removeSurfaceView(ViewGroup viewGroup) uidTextView.setText(""); } - public void notifyBroadcaster(int index, boolean exists) - { + /** + * Notify broadcaster. + * + * @param index the index + * @param exists the exists + */ + public void notifyBroadcaster(int index, boolean exists) { ViewGroup viewGroup = viewList.get(index); - if (viewGroup != null) - { + if (viewGroup != null) { TextView textView = viewGroup.findViewById(R.id.noBroadcaster); textView.setVisibility(exists ? View.GONE : View.VISIBLE); } } @Override - public boolean isViewFromObject(@NonNull View view, @NonNull Object object) - { + public boolean isViewFromObject(@NonNull View view, @NonNull Object object) { return view == object; } @Override - public CharSequence getPageTitle(int position) - { + public CharSequence getPageTitle(int position) { return ""; } } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VoiceEffects.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VoiceEffects.java index 5fc4a79aa..852a02b70 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VoiceEffects.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/VoiceEffects.java @@ -89,6 +89,9 @@ import io.agora.rtc2.RtcEngineConfig; import io.agora.rtc2.proxy.LocalAccessPointConfiguration; +/** + * The type Voice effects. 
+ */ @Example( index = 4, group = ADVANCED, @@ -238,30 +241,30 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. 
*/ @@ -288,7 +291,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { @Override public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ + /*leaveChannel and Destroy the RtcEngine instance*/ if (engine != null) { engine.leaveChannel(); } @@ -312,15 +315,14 @@ public void onClick(View v) { AndPermission.with(this).runtime().permission( Permission.Group.STORAGE, Permission.Group.MICROPHONE - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); } else { joined = false; resetControlLayoutByJoined(); - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. @@ -389,7 +391,7 @@ private int getPitch2Value(String str) { * Users that input the same channel name join the same channel. */ private void joinChannel(String channelId) { - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // audio config engine.setAudioProfile( @@ -397,13 +399,13 @@ private void joinChannel(String channelId) { Constants.AudioScenario.getValue(Constants.AudioScenario.valueOf(audioScenario.getSelectedItem().toString())) ); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. 
This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, accessToken -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ ChannelMediaOptions option = new ChannelMediaOptions(); @@ -588,7 +590,7 @@ public void onItemSelected(AdapterView parent, View view, int position, long for (Spinner spinner : voiceBeautifierSpinner) { if (spinner != parent) { - if(spinner.getSelectedItemPosition() != 0){ + if (spinner.getSelectedItemPosition() != 0) { spinner.setTag("reset"); spinner.setSelection(0); } @@ -605,33 +607,33 @@ public void onItemSelected(AdapterView parent, View view, int position, long for (Spinner spinner : audioEffectSpinner) { if (spinner != parent) { - if(spinner.getSelectedItemPosition() != 0){ + if (spinner.getSelectedItemPosition() != 0) { spinner.setTag("reset"); spinner.setSelection(0); } } } - _voice3DLayout.setVisibility(audioEffectPreset == ROOM_ACOUSTICS_3D_VOICE ? View.VISIBLE: View.GONE); + _voice3DLayout.setVisibility(audioEffectPreset == ROOM_ACOUSTICS_3D_VOICE ? View.VISIBLE : View.GONE); _pitchModeLayout.setVisibility(audioEffectPreset == PITCH_CORRECTION ? View.VISIBLE : View.GONE); _pitchValueLayout.setVisibility(audioEffectPreset == PITCH_CORRECTION ? 
View.VISIBLE : View.GONE);
 return;
 }
- if(parent == voiceConversion){
+ if (parent == voiceConversion) {
 String item = parent.getSelectedItem().toString();
 engine.setVoiceConversionPreset(getVoiceConversionValue(item));
 return;
 }
- if(parent == _pitchModeOption || parent == _pitchValueOption){
+ if (parent == _pitchModeOption || parent == _pitchValueOption) {
 int effectOption1 = getPitch1Value(_pitchModeOption.getSelectedItem().toString());
 int effectOption2 = getPitch2Value(_pitchValueOption.getSelectedItem().toString());
 engine.setAudioEffectParameters(PITCH_CORRECTION, effectOption1, effectOption2);
 }
- if(parent == ainsMode){
+ if (parent == ainsMode) {
 boolean enable = position > 0;
 /*
 The AI noise suppression modes:
@@ -672,7 +674,7 @@ private int getVoiceConversionValue(String label) {
 return VOICE_CHANGER_DARTH_VADER;
 case "VOICE_CHANGER_IRON_LADY":
 return VOICE_CHANGER_IRON_LADY;
- case "VOICE_CHANGER_SHIN_CHAN":
+ case "VOICE_CHANGER_SHIN_CHAN":
 return VOICE_CHANGER_SHIN_CHAN;
 case "VOICE_CHANGER_GIRLISH_MAN":
 return VOICE_CHANGER_GIRLISH_MAN;
@@ -807,11 +809,11 @@ public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
 if (!fromUser) {
 return;
 }
- if(seekBar == _voice3DCircle){
+ if (seekBar == _voice3DCircle) {
 int cicle = (int) (1 + 59 * progress * 1.0f / seekBar.getMax()); // [1,60], 10 default
 engine.setAudioEffectParameters(ROOM_ACOUSTICS_3D_VOICE, cicle, 0);
- }else if(seekBar == customPitch){
+ } else if (seekBar == customPitch) {
 double pitch = 0.5 + 1.5 * progress * 1.0f / seekBar.getMax();
 // pitch: [0.5,2.0], 1.0 default
 engine.setLocalVoicePitch(pitch);
@@ -827,11 +829,11 @@ public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
 // AUDIO_REVERB_ROOM_SIZE(2)：[0, 100] dB
 // AUDIO_REVERB_WET_DELAY(3)：Wet signal, [0, 200] ms
 // AUDIO_REVERB_STRENGTH(4)： [0, 100]
- if(reverbKey == Constants.AUDIO_REVERB_TYPE.AUDIO_REVERB_DRY_LEVEL || reverbKey == 
Constants.AUDIO_REVERB_TYPE.AUDIO_REVERB_WET_LEVEL){ + if (reverbKey == Constants.AUDIO_REVERB_TYPE.AUDIO_REVERB_DRY_LEVEL || reverbKey == Constants.AUDIO_REVERB_TYPE.AUDIO_REVERB_WET_LEVEL) { value = (int) (-20 + 30 * progress * 1.0f / seekBar.getMax()); - }else if(reverbKey == Constants.AUDIO_REVERB_TYPE.AUDIO_REVERB_WET_DELAY){ + } else if (reverbKey == Constants.AUDIO_REVERB_TYPE.AUDIO_REVERB_WET_DELAY) { value = (int) (200 * progress * 1.0f / seekBar.getMax()); - }else { + } else { value = (int) (100 * progress * 1.0f / seekBar.getMax()); } engine.setLocalVoiceReverb(reverbKey, value); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/ByteDanceBeauty.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/ByteDanceBeauty.java index 8bb8f11d2..42503bb14 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/ByteDanceBeauty.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/ByteDanceBeauty.java @@ -36,6 +36,9 @@ import io.agora.rtc2.video.ColorEnhanceOptions; import io.agora.rtc2.video.VideoCanvas; +/** + * The type Byte dance beauty. 
+ */ public class ByteDanceBeauty extends BaseFragment { private static final String TAG = "SceneTimeBeauty"; private static final Matrix IDENTITY_MATRIX = new Matrix(); @@ -75,32 +78,20 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat initVideoView(); initRtcEngine(); joinChannel(); - mBinding.switchVideoEffect.setOnCheckedChangeListener((buttonView, isChecked) -> - { + mBinding.switchVideoEffect.setOnCheckedChangeListener((buttonView, isChecked) -> { ColorEnhanceOptions options = new ColorEnhanceOptions(); options.strengthLevel = (float) 0.5f; options.skinProtectLevel = (float) 0.5f; rtcEngine.setColorEnhanceOptions(isChecked, options); }); - byteDanceBeautyAPI.initialize(new Config( - requireContext(), - rtcEngine, - ByteDanceBeautySDK.INSTANCE.getRenderManager(), - new EventCallback(beautyStats -> null, - () -> { - ByteDanceBeautySDK.INSTANCE.initEffect(requireContext()); - return null; - }, - () -> { - ByteDanceBeautySDK.INSTANCE.unInitEffect(); - return null; - }), - CaptureMode.Agora, - 0, - false, - new CameraConfig() - )); + byteDanceBeautyAPI.initialize(new Config(requireContext(), rtcEngine, ByteDanceBeautySDK.INSTANCE.getRenderManager(), new EventCallback(beautyStats -> null, () -> { + ByteDanceBeautySDK.INSTANCE.initEffect(requireContext()); + return null; + }, () -> { + ByteDanceBeautySDK.INSTANCE.unInitEffect(); + return null; + }), CaptureMode.Agora, 0, false, new CameraConfig())); byteDanceBeautyAPI.enable(true); } @@ -124,28 +115,21 @@ protected void onBackPressed() { private void initVideoView() { mBinding.cbFaceBeautify.setOnCheckedChangeListener((buttonView, isChecked) -> { byteDanceBeautyAPI.setBeautyPreset(isChecked ? 
BeautyPreset.DEFAULT : BeautyPreset.CUSTOM, - ByteDanceBeautySDK.INSTANCE.getBeautyNodePath(), ByteDanceBeautySDK.INSTANCE.getBeauty4ItemsNodePath(), + ByteDanceBeautySDK.INSTANCE.getBeautyNodePath(), + ByteDanceBeautySDK.INSTANCE.getBeauty4ItemsNodePath(), ByteDanceBeautySDK.INSTANCE.getReSharpNodePath()); }); mBinding.cbMakeup.setOnCheckedChangeListener((buttonView, isChecked) -> { RenderManager renderManager = ByteDanceBeautySDK.INSTANCE.getRenderManager(); - renderManager.appendComposerNodes( - new String[]{ByteDanceBeautySDK.INSTANCE.getMakeupTianmeiNodePath()} - ); - renderManager.updateComposerNodes( - ByteDanceBeautySDK.INSTANCE.getMakeupTianmeiNodePath(), - "Filter_ALL", - isChecked ? 0.5f : 0.f); - renderManager.updateComposerNodes( - ByteDanceBeautySDK.INSTANCE.getMakeupTianmeiNodePath(), - "Makeup_ALL", - isChecked ? 0.5f : 0f); + renderManager.appendComposerNodes(new String[]{ByteDanceBeautySDK.INSTANCE.getMakeupTianmeiNodePath()}); + renderManager.updateComposerNodes(ByteDanceBeautySDK.INSTANCE.getMakeupTianmeiNodePath(), "Filter_ALL", isChecked ? 0.5f : 0.f); + renderManager.updateComposerNodes(ByteDanceBeautySDK.INSTANCE.getMakeupTianmeiNodePath(), "Makeup_ALL", isChecked ? 
0.5f : 0f); }); mBinding.cbSticker.setOnCheckedChangeListener((buttonView, isChecked) -> { RenderManager renderManager = ByteDanceBeautySDK.INSTANCE.getRenderManager(); - if(isChecked){ + if (isChecked) { renderManager.setSticker(ByteDanceBeautySDK.INSTANCE.getStickerPath() + "/wochaotian"); - }else { + } else { renderManager.setSticker(null); } }); @@ -283,7 +267,7 @@ private void updateVideoLayouts(boolean isLocalFull) { if (parent instanceof ViewGroup && parent != mBinding.smallVideoContainer) { ((ViewGroup) parent).removeView(mRemoteVideoLayout); mBinding.smallVideoContainer.addView(mRemoteVideoLayout); - } else if(parent == null){ + } else if (parent == null) { mBinding.smallVideoContainer.addView(mRemoteVideoLayout); } } @@ -294,7 +278,7 @@ private void updateVideoLayouts(boolean isLocalFull) { if (parent instanceof ViewGroup && parent != mBinding.smallVideoContainer) { ((ViewGroup) parent).removeView(mLocalVideoLayout); mBinding.smallVideoContainer.addView(mLocalVideoLayout); - } else if(parent == null){ + } else if (parent == null) { mBinding.smallVideoContainer.addView(mLocalVideoLayout); } } @@ -304,7 +288,7 @@ private void updateVideoLayouts(boolean isLocalFull) { if (parent instanceof ViewGroup && parent != mBinding.fullVideoContainer) { ((ViewGroup) parent).removeView(mRemoteVideoLayout); mBinding.fullVideoContainer.addView(mRemoteVideoLayout); - } else if(parent == null) { + } else if (parent == null) { mBinding.fullVideoContainer.addView(mRemoteVideoLayout); } } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/ByteDanceBeautySDK.kt b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/ByteDanceBeautySDK.kt index b8a0a9285..d8eb4b274 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/ByteDanceBeautySDK.kt +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/ByteDanceBeautySDK.kt @@ -7,24 +7,59 
@@ import com.bytedance.labcv.effectsdk.RenderManager import io.agora.api.example.utils.FileKtUtils import java.util.concurrent.Executors +/** + * Byte dance beauty s d k + * + * @constructor Create empty Byte dance beauty s d k + */ object ByteDanceBeautySDK { private val TAG = "ByteDanceBeautySDK" - private val LICENSE_NAME = "Agora_test_20230815_20231115_io.agora.test.entfull_4.5.0_599.licbag" + private val LICENSE_NAME = "Agora_test_20231116_20240116_io.agora.test.entfull_4.5.0_893.licbag" private val workerThread = Executors.newSingleThreadExecutor() private var context: Application? = null private var storagePath = "" private var assetsPath = "" + /** + * Render manager + */ val renderManager = RenderManager() + + /** + * License path + */ var licensePath = "" + + /** + * Models path + */ var modelsPath = "" + + /** + * Beauty node path + */ var beautyNodePath = "" + + /** + * Beauty4items node path + */ var beauty4ItemsNodePath = "" + + /** + * Re sharp node path + */ var reSharpNodePath = "" + + /** + * Sticker path + */ var stickerPath = "" + /** + * Makeup tianmei node path + */ var makeupTianmeiNodePath = "" get() { if(field.isEmpty()){ @@ -36,6 +71,10 @@ object ByteDanceBeautySDK { } return field } + + /** + * Makeup yuan qi node path + */ var makeupYuanQiNodePath = "" get() { if(field.isEmpty()){ @@ -48,6 +87,11 @@ object ByteDanceBeautySDK { return field } + /** + * Init beauty s d k + * + * @param context + */ fun initBeautySDK(context: Context){ this.context = context.applicationContext as? 
Application storagePath = context.getExternalFilesDir("")?.absolutePath ?: return @@ -81,7 +125,11 @@ object ByteDanceBeautySDK { } } - // GL Thread + /** + * Init effect + * + * @param context + */// GL Thread fun initEffect(context: Context){ val ret = renderManager.init( context, @@ -96,7 +144,10 @@ object ByteDanceBeautySDK { renderManager.loadResourceWithTimeout(-1) } - // GL Thread + /** + * Un init effect + * + */// GL Thread fun unInitEffect(){ renderManager.release() } @@ -110,6 +161,25 @@ object ByteDanceBeautySDK { return true } + /** + * Set beauty + * + * @param smooth + * @param whiten + * @param thinFace + * @param enlargeEye + * @param redden + * @param shrinkCheekbone + * @param shrinkJawbone + * @param whiteTeeth + * @param hairlineHeight + * @param narrowNose + * @param mouthSize + * @param chinLength + * @param brightEye + * @param darkCircles + * @param nasolabialFolds + */ fun setBeauty( smooth: Float? = null, whiten: Float? = null, diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/FaceUnityBeauty.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/FaceUnityBeauty.java index b260f7c4b..f88110c4d 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/FaceUnityBeauty.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/FaceUnityBeauty.java @@ -44,8 +44,10 @@ import io.agora.rtc2.video.VideoCanvas; import io.agora.rtc2.video.VideoEncoderConfiguration; +/** + * The type Face unity beauty. 
+ */ public class FaceUnityBeauty extends BaseFragment { - private static final String TAG = "SceneTimeBeauty"; private FragmentBeautyFaceunityBinding mBinding; private RtcEngine rtcEngine; @@ -77,20 +79,10 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat initRtcEngine(); - faceUnityBeautyAPI.initialize(new Config( - requireContext(), - rtcEngine, - FaceUnityBeautySDK.INSTANCE.getFuRenderKit(), - null, - CaptureMode.Agora, - 0, - false, - new CameraConfig() - )); + faceUnityBeautyAPI.initialize(new Config(requireContext(), rtcEngine, FaceUnityBeautySDK.INSTANCE.getFuRenderKit(), null, CaptureMode.Agora, 0, false, new CameraConfig())); faceUnityBeautyAPI.enable(true); joinChannel(); - mBinding.switchVideoEffect.setOnCheckedChangeListener((buttonView, isChecked) -> - { + mBinding.switchVideoEffect.setOnCheckedChangeListener((buttonView, isChecked) -> { ColorEnhanceOptions options = new ColorEnhanceOptions(); options.strengthLevel = 0.5f; options.skinProtectLevel = 0.5f; @@ -117,7 +109,7 @@ protected void onBackPressed() { private void initVideoView() { mBinding.cbFaceBeautify.setOnCheckedChangeListener((buttonView, isChecked) -> { - faceUnityBeautyAPI.setBeautyPreset(isChecked? BeautyPreset.DEFAULT: BeautyPreset.CUSTOM); + faceUnityBeautyAPI.setBeautyPreset(isChecked ? 
BeautyPreset.DEFAULT : BeautyPreset.CUSTOM);
 });
 mBinding.cbMakeup.setOnCheckedChangeListener((buttonView, isChecked) -> {
 FURenderKit fuRenderKit = FaceUnityBeautySDK.INSTANCE.getFuRenderKit();
diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/FaceUnityBeautySDK.kt b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/FaceUnityBeautySDK.kt
index 775139094..a9d041238 100644
--- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/FaceUnityBeautySDK.kt
+++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/FaceUnityBeautySDK.kt
@@ -15,10 +15,19 @@ import com.faceunity.wrapper.faceunity
 import java.io.File
 import java.util.concurrent.Executors
+/**
+ * Face unity beauty s d k
+ *
+ * @constructor Create empty Face unity beauty s d k
+ */
 object FaceUnityBeautySDK {
 private val TAG = "FaceUnityBeautySDK"
 private val fuAIKit = FUAIKit.getInstance()
+
+ /**
+ * Fu render kit
+ */
 val fuRenderKit = FURenderKit.getInstance()
 /* AI道具*/
@@ -27,6 +36,11 @@ object FaceUnityBeautySDK {
 private val workerThread = Executors.newSingleThreadExecutor()
+ /**
+ * Init beauty
+ *
+ * @param context
+ */
 fun initBeauty(context: Context) {
 FURenderManager.setKitDebug(FULogger.LogLevel.TRACE)
 FURenderManager.setCoreDebug(FULogger.LogLevel.ERROR)
@@ -51,18 +65,51 @@ object FaceUnityBeautySDK {
 })
 }
+ /**
+ * Release
+ *
+ */
 fun release() {
 FURenderKit.getInstance().release()
 }
+ /**
+ * Get auth
+ *
+ * @return
+ */
 private fun getAuth(): ByteArray {
- val authpack = Class.forName("io.agora.api.example.examples.advanced.beauty.authpack")
- val aMethod = authpack.getDeclaredMethod("A")
- aMethod.isAccessible = true
- val authValue = aMethod.invoke(null) as? 
ByteArray - return authValue ?: ByteArray(0) + try { + val authpack = Class.forName("io.agora.api.example.examples.advanced.beauty.authpack") + val aMethod = authpack.getDeclaredMethod("A") + aMethod.isAccessible = true + val authValue = aMethod.invoke(null) as? ByteArray + return authValue ?: ByteArray(0) + } catch (e: Exception){ + Log.e(TAG, "getAuth >> error : $e") + } + return ByteArray(0) } + /** + * Set beauty + * + * @param smooth + * @param whiten + * @param thinFace + * @param enlargeEye + * @param redden + * @param shrinkCheekbone + * @param shrinkJawbone + * @param whiteTeeth + * @param hairlineHeight + * @param narrowNose + * @param mouthSize + * @param chinLength + * @param brightEye + * @param darkCircles + * @param nasolabialFolds + */ fun setBeauty( smooth: Double? = null, whiten: Double? = null, diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/SenseTimeBeauty.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/SenseTimeBeauty.java index e32c3fa95..fc4bb8b41 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/SenseTimeBeauty.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/SenseTimeBeauty.java @@ -40,6 +40,9 @@ import io.agora.rtc2.video.VideoCanvas; import io.agora.rtc2.video.VideoEncoderConfiguration; +/** + * The type Sense time beauty. 
+ */ public class SenseTimeBeauty extends BaseFragment { private static final String TAG = "SceneTimeBeauty"; @@ -81,8 +84,7 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat initVideoView(); initRtcEngine(); joinChannel(); - mBinding.switchVideoEffect.setOnCheckedChangeListener((buttonView, isChecked) -> - { + mBinding.switchVideoEffect.setOnCheckedChangeListener((buttonView, isChecked) -> { ColorEnhanceOptions options = new ColorEnhanceOptions(); options.strengthLevel = (float) 0.5f; options.skinProtectLevel = (float) 0.5f; @@ -127,32 +129,19 @@ protected void onBackPressed() { private void initVideoView() { mBinding.cbFaceBeautify.setOnCheckedChangeListener((buttonView, isChecked) -> { - senseTimeBeautyAPI.setBeautyPreset(isChecked? BeautyPreset.DEFAULT: BeautyPreset.CUSTOM); + senseTimeBeautyAPI.setBeautyPreset(isChecked ? BeautyPreset.DEFAULT : BeautyPreset.CUSTOM); }); mBinding.cbMakeup.setOnCheckedChangeListener((buttonView, isChecked) -> { - if(isChecked){ - SenseTimeBeautySDK.INSTANCE.setMakeUpItem( - requireContext(), - STEffectBeautyType.EFFECT_BEAUTY_MAKEUP_ALL, - "makeup_lip" + File.separator + "12自然.zip", - 1.0f - ); - }else{ - SenseTimeBeautySDK.INSTANCE.setMakeUpItem( - requireContext(), - STEffectBeautyType.EFFECT_BEAUTY_MAKEUP_ALL, - "", 0.0f - ); + if (isChecked) { + SenseTimeBeautySDK.INSTANCE.setMakeUpItem(requireContext(), STEffectBeautyType.EFFECT_BEAUTY_MAKEUP_ALL, "makeup_lip" + File.separator + "12自然.zip", 1.0f); + } else { + SenseTimeBeautySDK.INSTANCE.setMakeUpItem(requireContext(), STEffectBeautyType.EFFECT_BEAUTY_MAKEUP_ALL, "", 0.0f); } }); mBinding.cbSticker.setOnCheckedChangeListener((buttonView, isChecked) -> { - if(isChecked){ - SenseTimeBeautySDK.INSTANCE.setStickerItem( - requireContext(), - "sticker_face_shape" + File.separator + "ShangBanLe.zip", - true - ); - }else{ + if (isChecked) { + SenseTimeBeautySDK.INSTANCE.setStickerItem(requireContext(), "sticker_face_shape" + File.separator + 
"ShangBanLe.zip", true); + } else { SenseTimeBeautySDK.INSTANCE.cleanSticker(); } }); @@ -258,7 +247,6 @@ public void onRemoteVideoStats(RemoteVideoStats stats) { } - private void joinChannel() { int uid = new Random(System.currentTimeMillis()).nextInt(1000) + 10000; ChannelMediaOptions options = new ChannelMediaOptions(); @@ -300,7 +288,7 @@ private void updateVideoLayouts(boolean isLocalFull) { if (parent instanceof ViewGroup && parent != mBinding.smallVideoContainer) { ((ViewGroup) parent).removeView(mRemoteVideoLayout); mBinding.smallVideoContainer.addView(mRemoteVideoLayout); - } else if(parent == null){ + } else if (parent == null) { mBinding.smallVideoContainer.addView(mRemoteVideoLayout); } } @@ -311,7 +299,7 @@ private void updateVideoLayouts(boolean isLocalFull) { if (parent instanceof ViewGroup && parent != mBinding.smallVideoContainer) { ((ViewGroup) parent).removeView(mLocalVideoLayout); mBinding.smallVideoContainer.addView(mLocalVideoLayout); - } else if(parent == null){ + } else if (parent == null) { mBinding.smallVideoContainer.addView(mLocalVideoLayout); } } @@ -321,7 +309,7 @@ private void updateVideoLayouts(boolean isLocalFull) { if (parent instanceof ViewGroup && parent != mBinding.fullVideoContainer) { ((ViewGroup) parent).removeView(mRemoteVideoLayout); mBinding.fullVideoContainer.addView(mRemoteVideoLayout); - } else if(parent == null) { + } else if (parent == null) { mBinding.fullVideoContainer.addView(mRemoteVideoLayout); } } @@ -329,5 +317,4 @@ private void updateVideoLayouts(boolean isLocalFull) { } - } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/SenseTimeBeautySDK.kt b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/SenseTimeBeautySDK.kt index 367aa4472..fe28d394c 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/SenseTimeBeautySDK.kt +++ 
b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/beauty/SenseTimeBeautySDK.kt @@ -12,6 +12,11 @@ import com.softsugar.stmobile.params.STHumanActionParamsType import io.agora.api.example.utils.FileKtUtils import java.util.concurrent.Executors +/** + * Sense time beauty s d k + * + * @constructor Create empty Sense time beauty s d k + */ object SenseTimeBeautySDK { private val TAG = "SenseTimeBeautySDK" @@ -28,24 +33,36 @@ object SenseTimeBeautySDK { private val MODEL_SEGMENT_HAIR = "models/M_SenseME_Segment_Hair_p_4.4.0.model" // hair segmentation private val MODEL_FACE_OCCLUSION = "models/M_SenseME_FaceOcclusion_p_1.0.7.1.model" // face occlusion private val MODEL_SEGMENT_SKY = "models/M_SenseME_Segment_Sky_p_1.1.0.1.model" // sky segmentation - private val MODEL_SEGMENT_SKIN = "models/M_SenseME_Segment_Skin_p_1.0.1.1.model" // skin segmentation + // private val MODEL_SEGMENT_SKIN = "models/M_SenseME_Segment_Skin_p_1.0.1.1.model" // skin segmentation private val MODEL_3DMESH = "models/M_SenseME_3DMesh_Face2396pt_280kpts_Ear_p_1.1.0v2.model" // 3DMesh - private val MODEL_HEAD_P_EAR = "models/M_SenseME_Ear_p_1.0.1.1.model" // ear model used with the mesh + // private val MODEL_HEAD_P_EAR = "models/M_SenseME_Ear_p_1.0.1.1.model" // ear model used with the mesh private val MODEL_360HEAD_INSTANCE = "models/M_SenseME_3Dmesh_360Head2396pt_p_1.0.0.1.model" // 360-degree head mesh private val MODEL_FOOT = "models/M_SenseME_Foot_p_2.10.7.model" // shoe detection model private val MODEL_PANT = "models/M_SenseME_Segment_Trousers_p_1.1.10.model" // trouser-leg detection private val MODEL_WRIST = "models/M_SenseME_Wrist_p_1.4.0.model" // watch try-on private val MODEL_CLOTH = "models/M_SenseME_Segment_Clothes_p_1.0.2.2.model" // clothes segmentation private val MODEL_HEAD_INSTANCE = "models/M_SenseME_Segment_Head_Instance_p_1.1.0.1.model" // instance segmentation version - private val MODEL_HEAD_P_INSTANCE = "models/M_SenseME_Head_p_1.3.0.1.model" // 360-degree head model + // private val MODEL_HEAD_P_INSTANCE = "models/M_SenseME_Head_p_1.3.0.1.model" // 360-degree head model private val MODEL_NAIL = 
"models/M_SenseME_Nail_p_2.4.0.model" // nail detection private val workerThread = Executors.newSingleThreadExecutor() + /** + * Mobile effect native + */ val mobileEffectNative = STMobileEffectNative() + + /** + * Human action native + */ val humanActionNative = STMobileHumanActionNative() + /** + * Init beauty s d k + * + * @param context + */ fun initBeautySDK(context: Context){ workerThread.submit { checkLicense(context) @@ -53,6 +70,11 @@ object SenseTimeBeautySDK { } } + /** + * Init mobile effect + * + * @param context + */ fun initMobileEffect(context: Context){ val result = mobileEffectNative.createInstance(context, STMobileEffectNative.EFFECT_CONFIG_NONE) @@ -60,10 +82,18 @@ object SenseTimeBeautySDK { Log.d(TAG, "SenseTime >> STMobileEffectNative create result : $result") } + /** + * Un init mobile effect + * + */ fun unInitMobileEffect(){ mobileEffectNative.destroyInstance() } + /** + * Release + * + */ fun release() { mobileEffectNative.destroyInstance() } @@ -126,6 +156,14 @@ object SenseTimeBeautySDK { humanActionNative.setParam(STHumanActionParamsType.ST_HUMAN_ACTION_PARAM_HEAD_SEGMENT_INSTANCE, 1.0f) } + /** + * Set make up item + * + * @param context + * @param type + * @param path + * @param strength + */ fun setMakeUpItem(context: Context, type: Int, path: String = "", strength: Float = 1.0f) { if (path.isNotEmpty()) { val assets = context.assets @@ -136,6 +174,13 @@ object SenseTimeBeautySDK { } } + /** + * Set sticker item + * + * @param context + * @param path + * @param attach + */ fun setStickerItem(context: Context, path: String, attach: Boolean) { if(attach){ val assets = context.assets @@ -147,6 +192,10 @@ object SenseTimeBeautySDK { } } + /** + * Clean sticker + * + */ fun cleanSticker(){ packageMap.values.forEach { mobileEffectNative.removeEffect(it) @@ -154,6 +203,29 @@ object SenseTimeBeautySDK { } packageMap.clear() } + /** + * Set beauty + * + * @param smooth + * @param whiten + * @param thinFace + * @param enlargeEye + * @param redden + * 
@param shrinkCheekbone + * @param shrinkJawbone + * @param whiteTeeth + * @param hairlineHeight + * @param narrowNose + * @param mouthSize + * @param chinLength + * @param brightEye + * @param darkCircles + * @param nasolabialFolds + * @param saturation + * @param contrast + * @param sharpen + * @param clear + */ fun setBeauty( smooth: Float? = null, whiten: Float? = null, diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/AudioPlayer.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/AudioPlayer.java index 84c814cef..34ee4379c 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/AudioPlayer.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/AudioPlayer.java @@ -4,31 +4,43 @@ import android.media.AudioTrack; import android.util.Log; +/** + * The type Audio player. + */ public class AudioPlayer { private static final int DEFAULT_PLAY_MODE = AudioTrack.MODE_STREAM; - private static final String TAG = "AudioPlayer"; + private static final String TAG = "AudioPlayer"; private AudioTrack mAudioTrack; - private AudioStatus mAudioStatus = AudioStatus.STOPPED ; + private AudioStatus mAudioStatus = AudioStatus.STOPPED; - public AudioPlayer(int streamType, int sampleRateInHz, int channelConfig, int audioFormat){ - if(mAudioStatus == AudioStatus.STOPPED) { - int Val = 0; - if(1 == channelConfig) - Val = AudioFormat.CHANNEL_OUT_MONO; - else if(2 == channelConfig) - Val = AudioFormat.CHANNEL_OUT_STEREO; - else - Log.e(TAG, "channelConfig is wrong !"); + /** + * Instantiates a new Audio player. 
+ * + * @param streamType the stream type + * @param sampleRateInHz the sample rate in hz + * @param channelConfig the channel config + * @param audioFormat the audio format + */ + public AudioPlayer(int streamType, int sampleRateInHz, int channelConfig, int audioFormat) { + if (mAudioStatus == AudioStatus.STOPPED) { + int format = 0; + if (1 == channelConfig) { + format = AudioFormat.CHANNEL_OUT_MONO; + } else if (2 == channelConfig) { + format = AudioFormat.CHANNEL_OUT_STEREO; + } else { + Log.e(TAG, "channelConfig is wrong !"); + } - int mMinBufferSize = AudioTrack.getMinBufferSize(sampleRateInHz, Val, audioFormat); + int mMinBufferSize = AudioTrack.getMinBufferSize(sampleRateInHz, format, audioFormat); Log.e(TAG, " sampleRateInHz :" + sampleRateInHz + " channelConfig :" + channelConfig + " audioFormat: " + audioFormat + " mMinBufferSize: " + mMinBufferSize); if (mMinBufferSize == AudioTrack.ERROR_BAD_VALUE) { - Log.e(TAG,"AudioTrack.ERROR_BAD_VALUE : " + AudioTrack.ERROR_BAD_VALUE) ; + Log.e(TAG, "AudioTrack.ERROR_BAD_VALUE : " + AudioTrack.ERROR_BAD_VALUE); } - mAudioTrack = new AudioTrack(streamType, sampleRateInHz, Val, audioFormat, mMinBufferSize, DEFAULT_PLAY_MODE); + mAudioTrack = new AudioTrack(streamType, sampleRateInHz, format, audioFormat, mMinBufferSize, DEFAULT_PLAY_MODE); if (mAudioTrack.getState() == AudioTrack.STATE_UNINITIALIZED) { throw new RuntimeException("Error on AudioTrack created"); } @@ -37,8 +49,13 @@ else if(2 == channelConfig) Log.e(TAG, "mAudioStatus: " + mAudioStatus); } + /** + * Start player boolean. + * + * @return the boolean + */ public boolean startPlayer() { - if(mAudioStatus == AudioStatus.INITIALISING) { + if (mAudioStatus == AudioStatus.INITIALISING) { mAudioTrack.play(); mAudioStatus = AudioStatus.RUNNING; } @@ -46,8 +63,11 @@ public boolean startPlayer() { return true; } + /** + * Stop player. 
+ */ public void stopPlayer() { - if(null != mAudioTrack){ + if (null != mAudioTrack) { mAudioStatus = AudioStatus.STOPPED; mAudioTrack.stop(); mAudioTrack.release(); @@ -56,18 +76,38 @@ public void stopPlayer() { Log.e(TAG, "mAudioStatus: " + mAudioStatus); } + /** + * Play boolean. + * + * @param audioData the audio data + * @param offsetInBytes the offset in bytes + * @param sizeInBytes the size in bytes + * @return the boolean + */ public boolean play(byte[] audioData, int offsetInBytes, int sizeInBytes) { - if(mAudioStatus == AudioStatus.RUNNING) { + if (mAudioStatus == AudioStatus.RUNNING) { mAudioTrack.write(audioData, offsetInBytes, sizeInBytes); - }else{ + } else { Log.e(TAG, "=== No data to AudioTrack !! mAudioStatus: " + mAudioStatus); } return true; } + /** + * The enum Audio status. + */ public enum AudioStatus { + /** + * Initialising audio status. + */ INITIALISING, + /** + * Running audio status. + */ RUNNING, + /** + * Stopped audio status. + */ STOPPED } } \ No newline at end of file diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/CustomAudioRender.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/CustomAudioRender.java index f88ea322d..479854f9a 100755 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/CustomAudioRender.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/CustomAudioRender.java @@ -6,7 +6,6 @@ import android.media.AudioFormat; import android.media.AudioManager; import android.os.Bundle; -import android.os.Handler; import android.os.Process; import android.util.Log; import android.view.LayoutInflater; @@ -41,20 +40,16 @@ /** * This demo demonstrates how to make a one-to-one voice call */ -@Example( - index = 6, - group = ADVANCED, - name = R.string.item_customaudiorender, - actionId = R.id.action_mainFragment_to_CustomAudioRender, - tipsId = 
R.string.customaudiorender -) +@Example(index = 6, group = ADVANCED, name = R.string.item_customaudiorender, actionId = R.id.action_mainFragment_to_CustomAudioRender, tipsId = R.string.customaudiorender) public class CustomAudioRender extends BaseFragment implements View.OnClickListener { private static final String TAG = CustomAudioRender.class.getSimpleName(); private EditText et_channel; private Button join; private boolean joined = false; + /** + * The constant engine. + */ public static RtcEngineEx engine; - private ChannelMediaOptions option = new ChannelMediaOptions(); private static final Integer SAMPLE_RATE = 44100; private static final Integer SAMPLE_NUM_OF_CHANNEL = 2; @@ -69,22 +64,6 @@ public class CustomAudioRender extends BaseFragment implements View.OnClickListe private AudioSeatManager audioSeatManager; - @Override - public void onCreate(@Nullable Bundle savedInstanceState) { - super.onCreate(savedInstanceState); - handler = new Handler(); - initMediaOption(); - } - - private void initMediaOption() { - option.autoSubscribeAudio = true; - option.autoSubscribeVideo = true; - option.publishMicrophoneTrack = true; - option.publishCustomAudioTrack = false; - option.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER; - option.enableAudioRecordingOrPlayout = true; - } - @Nullable @Override @@ -109,8 +88,7 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat view.findViewById(R.id.audio_place_06), view.findViewById(R.id.audio_place_07), view.findViewById(R.id.audio_place_08), - view.findViewById(R.id.audio_place_09) - ); + view.findViewById(R.id.audio_place_09)); } @Override @@ -123,30 +101,30 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. 
See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = (RtcEngineEx) RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. 
*/ @@ -165,9 +143,9 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { engine.setLocalAccessPoint(localAccessPointConfiguration); } - engine.setExternalAudioSource(true, SAMPLE_RATE, SAMPLE_NUM_OF_CHANNEL); - - audioPlayer = new AudioPlayer(AudioManager.STREAM_MUSIC, SAMPLE_RATE, SAMPLE_NUM_OF_CHANNEL, + audioPlayer = new AudioPlayer(AudioManager.STREAM_MUSIC, + SAMPLE_RATE, + SAMPLE_NUM_OF_CHANNEL, AudioFormat.ENCODING_PCM_16BIT); } catch (Exception e) { e.printStackTrace(); @@ -179,7 +157,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { public void onDestroy() { super.onDestroy(); pulling = false; - if(pullingTask != null){ + if (pullingTask != null) { try { pullingTask.join(); pullingTask = null; @@ -188,7 +166,7 @@ public void onDestroy() { } } audioPlayer.stopPlayer(); - /**leaveChannel and Destroy the RtcEngine instance*/ + /*leaveChannel and Destroy the RtcEngine instance*/ if (engine != null) { engine.leaveChannel(); } @@ -197,7 +175,6 @@ public void onDestroy() { } - @Override public void onClick(View v) { if (v.getId() == R.id.btn_join) { @@ -211,17 +188,13 @@ public void onClick(View v) { return; } // Request permission - AndPermission.with(this).runtime().permission( - Permission.Group.STORAGE, - Permission.Group.MICROPHONE - ).onGranted(permissions -> - { + AndPermission.with(this).runtime().permission(Permission.Group.STORAGE, Permission.Group.MICROPHONE).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. 
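[Editor's note, not part of the patch] The CustomAudioRender changes above create an `AudioPlayer` at 44100 Hz, stereo, `ENCODING_PCM_16BIT`, and the pulling task later reads `BUFFER_SIZE` bytes per `PULL_INTERVAL`. The hunks do not show the values of `BUFFER_SIZE` and `PULL_INTERVAL`, so the 10 ms interval below is an illustrative assumption; the sketch only pins down the standard PCM arithmetic relating the two:

```java
// Standalone sketch: bytes of interleaved PCM per pull interval.
// 10 ms is an assumed interval; the patch does not show PULL_INTERVAL's value.
class PcmBufferSize {
    static int bufferSizeBytes(int sampleRateHz, int channels, int bytesPerSample, int intervalMs) {
        // samples in the interval, times channels, times bytes per sample
        return sampleRateHz * intervalMs / 1000 * channels * bytesPerSample;
    }

    public static void main(String[] args) {
        // 44100 Hz, stereo, 16-bit PCM, 10 ms pull interval
        System.out.println(bufferSizeBytes(44100, 2, 2, 10));
    }
}
```

If the pull loop uses a buffer that does not match this size for its interval, playback drifts or stutters, which is why the original pacing code compared elapsed time against `PULL_INTERVAL`.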
@@ -242,7 +215,7 @@ public void onClick(View v) { pulling = false; join.setText(getString(R.string.join)); audioSeatManager.downAllSeats(); - if(pullingTask != null){ + if (pullingTask != null) { try { pullingTask.join(); pullingTask = null; @@ -259,35 +232,25 @@ public void onClick(View v) { * Users that input the same channel name join the same channel. */ private void joinChannel(String channelId) { - /**In the demo, the default is to enter as the anchor.*/ - engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); - /**Sets the external audio source. - * @param enabled Sets whether to enable/disable the external audio source: - * true: Enable the external audio source. - * false: (Default) Disable the external audio source. - * @param sampleRate Sets the sample rate (Hz) of the external audio source, which can be - * set as 8000, 16000, 32000, 44100, or 48000 Hz. - * @param channels Sets the number of channels of the external audio source: - * 1: Mono. - * 2: Stereo. - * @return - * 0: Success. - * < 0: Failure. - * PS: Ensure that you call this method before the joinChannel method.*/ - // engine.setExternalAudioSource(true, SAMPLE_RATE, SAMPLE_NUM_OF_CHANNEL, 2, false, true); - - - - /**Please configure accessToken in the string_config file. + + engine.setExternalAudioSink(true, SAMPLE_RATE, SAMPLE_NUM_OF_CHANNEL); + + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, ret -> { - /** Allows a user to join a channel. 
+ ChannelMediaOptions option = new ChannelMediaOptions(); + option.channelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; + option.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER; + + + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ int res = engine.joinChannel(ret, channelId, 0, option); + if (res != 0) { // Usually happens with invalid parameters // Error code description can be found at: @@ -337,7 +300,7 @@ public void run() { join.setText(getString(R.string.leave)); pulling = true; audioPlayer.startPlayer(); - if(pullingTask == null){ + if (pullingTask == null) { pullingTask = new Thread(new PullingTask()); pullingTask.start(); } @@ -358,7 +321,13 @@ public void onUserOffline(int uid, int reason) { } }; + /** + * The type Pulling task. + */ class PullingTask implements Runnable { + /** + * The Number. + */ long number = 0; @Override @@ -366,23 +335,18 @@ public void run() { Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_AUDIO); while (pulling) { Log.i(TAG, "pushExternalAudioFrame times:" + number++); - long before = System.currentTimeMillis(); ByteBuffer frame = ByteBuffer.allocateDirect(BUFFER_SIZE); engine.pullPlaybackAudioFrame(frame, BUFFER_SIZE); byte[] data = new byte[frame.remaining()]; frame.get(data, 0, data.length); - audioPlayer.play(data, 0, BUFFER_SIZE); - long now = System.currentTimeMillis(); - long consuming = now - before; - if(consuming < PULL_INTERVAL){ - try { - Thread.sleep(PULL_INTERVAL - consuming); - } catch (InterruptedException e) { - Log.e(TAG, "PushingTask Interrupted"); - } + // simple audio filter + for (int i = 0; i < data.length; i++) { + data[i] = (byte) (data[i] + 5); } + + audioPlayer.play(data, 0, BUFFER_SIZE); } } } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/CustomAudioSource.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/CustomAudioSource.java 
index bddfecc96..cf4024189 100755 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/CustomAudioSource.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/customaudio/CustomAudioSource.java @@ -40,19 +40,16 @@ /** * This demo demonstrates how to make a one-to-one voice call */ -@Example( - index = 5, - group = ADVANCED, - name = R.string.item_customaudiosource, - actionId = R.id.action_mainFragment_to_CustomAudioSource, - tipsId = R.string.customaudio -) +@Example(index = 5, group = ADVANCED, name = R.string.item_customaudiosource, actionId = R.id.action_mainFragment_to_CustomAudioSource, tipsId = R.string.customaudio) public class CustomAudioSource extends BaseFragment implements View.OnClickListener, CompoundButton.OnCheckedChangeListener { private static final String TAG = CustomAudioSource.class.getSimpleName(); private EditText et_channel; private Button join; private int myUid; private boolean joined = false; + /** + * The constant engine. + */ public static RtcEngineEx engine; private Switch mic, pcm; private ChannelMediaOptions option = new ChannelMediaOptions(); @@ -119,41 +116,37 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. 
A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = (RtcEngineEx) RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ - engine.setParameters("{" - + "\"rtc.report_app_scenario\":" - + "{" - + "\"appScenario\":" + 100 + "," - + "\"serviceType\":" + 11 + "," - + "\"appVersion\":\"" + RtcEngine.getSdkVersion() + "\"" - + "}" - + "}"); + engine.setParameters("{" + "\"rtc.report_app_scenario\":" + + "{" + "\"appScenario\":" + + 100 + "," + "\"serviceType\":" + 11 + "," + + "\"appVersion\":\"" + RtcEngine.getSdkVersion() + "\"" + "}" + "}"); /* setting the local access point if the private cloud ip was set, otherwise the config will be invalid.*/ LocalAccessPointConfiguration localAccessPointConfiguration = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getPrivateCloudConfig(); if (localAccessPointConfiguration != null) { @@ -162,8 +155,12 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { } audioPushingHelper = new AudioFileReader(requireContext(), (buffer, timestamp) -> { - if(joined && engine != null && customAudioTrack != -1){ - int ret = engine.pushExternalAudioFrame(buffer, timestamp, AudioFileReader.SAMPLE_RATE, AudioFileReader.SAMPLE_NUM_OF_CHANNEL, 
Constants.BytesPerSample.TWO_BYTES_PER_SAMPLE, customAudioTrack); + if (joined && engine != null && customAudioTrack != -1) { + int ret = engine.pushExternalAudioFrame(buffer, timestamp, + AudioFileReader.SAMPLE_RATE, + AudioFileReader.SAMPLE_NUM_OF_CHANNEL, + Constants.BytesPerSample.TWO_BYTES_PER_SAMPLE, + customAudioTrack); Log.i(TAG, "pushExternalAudioFrame times:" + (++pushTimes) + ", ret=" + ret); } }); @@ -176,14 +173,14 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { @Override public void onDestroy() { super.onDestroy(); - if(customAudioTrack != -1){ + if (customAudioTrack != -1) { engine.destroyCustomAudioTrack(customAudioTrack); customAudioTrack = -1; } - if(audioPushingHelper != null){ + if (audioPushingHelper != null) { audioPushingHelper.stop(); } - /**leaveChannel and Destroy the RtcEngine instance*/ + /*leaveChannel and Destroy the RtcEngine instance*/ if (engine != null) { engine.leaveChannel(); } @@ -219,17 +216,13 @@ public void onClick(View v) { return; } // Request permission - AndPermission.with(this).runtime().permission( - Permission.Group.STORAGE, - Permission.Group.MICROPHONE - ).onGranted(permissions -> - { + AndPermission.with(this).runtime().permission(Permission.Group.STORAGE, Permission.Group.MICROPHONE).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. 
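[Editor's note, not part of the patch] The hunk above pushes frames from `AudioFileReader` into `pushExternalAudioFrame` with a sample rate, channel count, bytes-per-sample, and timestamp. `AudioFileReader` itself is not shown, so the following is a hedged sketch of the framing it implies: raw PCM split into fixed-duration chunks, each tagged with a monotonically increasing timestamp. Frame duration and all names here are assumptions for illustration only:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: chunk raw PCM into fixed 10 ms frames with timestamps,
// the shape of data pushExternalAudioFrame consumes. Not the real AudioFileReader.
class PcmFramer {
    static final int SAMPLE_RATE = 44100;
    static final int CHANNELS = 2;
    static final int BYTES_PER_SAMPLE = 2;
    static final int FRAME_MS = 10;
    static final int FRAME_BYTES = SAMPLE_RATE / 1000 * FRAME_MS * CHANNELS * BYTES_PER_SAMPLE;

    /** One PCM frame plus its presentation timestamp in milliseconds. */
    static class Frame {
        final byte[] data;
        final long timestampMs;
        Frame(byte[] data, long timestampMs) {
            this.data = data;
            this.timestampMs = timestampMs;
        }
    }

    static List<Frame> frames(byte[] pcm, long startMs) {
        List<Frame> out = new ArrayList<>();
        for (int off = 0; off + FRAME_BYTES <= pcm.length; off += FRAME_BYTES) {
            byte[] chunk = new byte[FRAME_BYTES];
            System.arraycopy(pcm, off, chunk, 0, FRAME_BYTES);
            // Each successive frame advances the timestamp by one frame duration.
            out.add(new Frame(chunk, startMs + (long) (off / FRAME_BYTES) * FRAME_MS));
        }
        return out;
    }

    public static void main(String[] args) {
        List<Frame> fs = frames(new byte[FRAME_BYTES * 3], 0);
        System.out.println(fs.size() + " frames, last ts=" + fs.get(fs.size() - 1).timestampMs);
    }
}
```

Monotonic timestamps matter here: the SDK uses them to pace and align the injected track against other published streams.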
@@ -252,7 +245,7 @@ public void onClick(View v) { pcm.setEnabled(false); pcm.setChecked(false); mic.setChecked(true); - if(audioPushingHelper != null){ + if (audioPushingHelper != null) { audioPushingHelper.stop(); } audioSeatManager.downAllSeats(); @@ -265,9 +258,9 @@ public void onClick(View v) { * Users that input the same channel name join the same channel. */ private void joinChannel(String channelId) { - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); - /**Sets the external audio source. + /*Sets the external audio source. * @param enabled Sets whether to enable/disable the external audio source: * true: Enable the external audio source. * false: (Default) Disable the external audio source. @@ -284,14 +277,14 @@ private void joinChannel(String channelId) { config.enableLocalPlayback = false; customAudioTrack = engine.createCustomAudioTrack(Constants.AudioTrackType.AUDIO_TRACK_MIXABLE, config); - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, ret -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. 
if you do not specify the uid, we will generate the uid for you*/ int res = engine.joinChannel(ret, channelId, 0, option); if (res != 0) { @@ -343,7 +336,7 @@ public void run() { pcm.setEnabled(true); join.setEnabled(true); join.setText(getString(R.string.leave)); - if(audioPushingHelper != null){ + if (audioPushingHelper != null) { pushTimes = 0; audioPushingHelper.start(); } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/videoRender/GLTextureView.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/videoRender/GLTextureView.java index 183d71dc7..5ee769b76 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/videoRender/GLTextureView.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/videoRender/GLTextureView.java @@ -1,17 +1,30 @@ package io.agora.api.example.examples.advanced.videoRender; +import static android.opengl.EGL14.EGL_BAD_ACCESS; +import static android.opengl.EGL14.EGL_BAD_ALLOC; +import static android.opengl.EGL14.EGL_BAD_ATTRIBUTE; +import static android.opengl.EGL14.EGL_BAD_CONFIG; +import static android.opengl.EGL14.EGL_BAD_CONTEXT; +import static android.opengl.EGL14.EGL_BAD_CURRENT_SURFACE; +import static android.opengl.EGL14.EGL_BAD_DISPLAY; +import static android.opengl.EGL14.EGL_BAD_MATCH; +import static android.opengl.EGL14.EGL_BAD_NATIVE_PIXMAP; +import static android.opengl.EGL14.EGL_BAD_NATIVE_WINDOW; +import static android.opengl.EGL14.EGL_BAD_PARAMETER; +import static android.opengl.EGL14.EGL_BAD_SURFACE; +import static android.opengl.EGL14.EGL_NOT_INITIALIZED; +import static android.opengl.EGL14.EGL_SUCCESS; + import android.content.Context; import android.graphics.SurfaceTexture; import android.opengl.EGL14; import android.opengl.EGLExt; import android.opengl.GLDebugHelper; -import android.opengl.GLSurfaceView; import android.util.AttributeSet; import android.util.Log; import 
android.view.TextureView; import android.view.View; - import java.io.Writer; import java.lang.ref.WeakReference; import java.util.ArrayList; @@ -25,21 +38,6 @@ import javax.microedition.khronos.opengles.GL; import javax.microedition.khronos.opengles.GL10; -import static android.opengl.EGL14.EGL_BAD_ACCESS; -import static android.opengl.EGL14.EGL_BAD_ALLOC; -import static android.opengl.EGL14.EGL_BAD_ATTRIBUTE; -import static android.opengl.EGL14.EGL_BAD_CONFIG; -import static android.opengl.EGL14.EGL_BAD_CONTEXT; -import static android.opengl.EGL14.EGL_BAD_CURRENT_SURFACE; -import static android.opengl.EGL14.EGL_BAD_DISPLAY; -import static android.opengl.EGL14.EGL_BAD_MATCH; -import static android.opengl.EGL14.EGL_BAD_NATIVE_PIXMAP; -import static android.opengl.EGL14.EGL_BAD_NATIVE_WINDOW; -import static android.opengl.EGL14.EGL_BAD_PARAMETER; -import static android.opengl.EGL14.EGL_BAD_SURFACE; -import static android.opengl.EGL14.EGL_NOT_INITIALIZED; -import static android.opengl.EGL14.EGL_SUCCESS; - /** * Implemented with reference to {@link GLSurfaceView} * * @@ -58,17 +56,17 @@ public class GLTextureView extends TextureView implements TextureView.SurfaceTex * The renderer only renders * when the surface is created, or when {@link #requestRender} is called. * - * @see #getRenderMode() - * @see #setRenderMode(int) - * @see #requestRender() + * @see #getRenderMode() #getRenderMode() + * @see #setRenderMode(int) #setRenderMode(int) + * @see #requestRender() #requestRender() */ public final static int RENDERMODE_WHEN_DIRTY = 0; /** * The renderer is called * continuously to re-render the scene. * - * @see #getRenderMode() - * @see #setRenderMode(int) + * @see #getRenderMode() #getRenderMode() + * @see #setRenderMode(int) #setRenderMode(int) */ public final static int RENDERMODE_CONTINUOUSLY = 1; @@ -77,27 +75,35 @@ public class GLTextureView extends TextureView implements TextureView.SurfaceTex * that an error has occurred.
This can be used to help track down which OpenGL ES call * is causing an error. * - * @see #getDebugFlags - * @see #setDebugFlags + * @see #getDebugFlags #getDebugFlags + * @see #setDebugFlags #setDebugFlags */ public final static int DEBUG_CHECK_GL_ERROR = 1; /** * Log GL calls to the system log at "verbose" level with tag "GLTextureView". * - * @see #getDebugFlags - * @see #setDebugFlags + * @see #getDebugFlags #getDebugFlags + * @see #setDebugFlags #setDebugFlags */ public final static int DEBUG_LOG_GL_CALLS = 2; /** * Constructor; {@link #setRenderer} must be called before rendering can start + * + * @param context the context */ public GLTextureView(Context context) { super(context); init(); } + /** + * Instantiates a new Gl texture view. + * + * @param context the context + * @param attributeSet the attribute set + */ public GLTextureView(Context context, AttributeSet attributeSet) { super(context, attributeSet); init(); @@ -145,8 +151,8 @@ public void setGLWrapper(GLWrapper glWrapper) { * whenever a surface is created. The default value is zero. * * @param debugFlags the new debug flags - * @see #DEBUG_CHECK_GL_ERROR - * @see #DEBUG_LOG_GL_CALLS + * @see #DEBUG_CHECK_GL_ERROR #DEBUG_CHECK_GL_ERROR + * @see #DEBUG_LOG_GL_CALLS #DEBUG_LOG_GL_CALLS */ public void setDebugFlags(int debugFlags) { mDebugFlags = debugFlags; @@ -186,6 +192,8 @@ public void setPreserveEGLContextOnPause(boolean preserveOnPause) { } /** + * Gets preserve egl context on pause. + * * @return true if the EGL context will be preserved when paused */ public boolean getPreserveEGLContextOnPause() { @@ -243,6 +251,8 @@ public void setRenderer(Renderer renderer) { * If this method is not called, then by default * a context will be created with no shared context and * with a null attribute list. + * + * @param factory the factory */ public void setEGLContextFactory(EGLContextFactory factory) { checkRenderThreadState(); @@ -257,6 +267,8 @@ public void setEGLContextFactory(EGLContextFactory factory) { *

    * If this method is not called, then by default * a window surface will be created with a null attribute list. + * + * @param factory the factory */ public void setEGLWindowSurfaceFactory(EGLWindowSurfaceFactory factory) { checkRenderThreadState(); @@ -273,6 +285,8 @@ public void setEGLWindowSurfaceFactory(EGLWindowSurfaceFactory factory) { * view will choose an EGLConfig that is compatible with the current * android.view.Surface, with a depth buffer depth of * at least 16 bits. + * + * @param configChooser the config chooser */ public void setEGLConfigChooser(EGLConfigChooser configChooser) { checkRenderThreadState(); @@ -290,6 +304,8 @@ public void setEGLConfigChooser(EGLConfigChooser configChooser) { * If no setEGLConfigChooser method is called, then by default the * view will choose an RGB_888 surface with a depth buffer depth of * at least 16 bits. + * + * @param needDepth the need depth */ public void setEGLConfigChooser(boolean needDepth) { setEGLConfigChooser(new SimpleEGLConfigChooser(needDepth)); @@ -306,6 +322,13 @@ public void setEGLConfigChooser(boolean needDepth) { * If no setEGLConfigChooser method is called, then by default the * view will choose an RGB_888 surface with a depth buffer depth of * at least 16 bits. 
+ * + * @param redSize the red size + * @param greenSize the green size + * @param blueSize the blue size + * @param alphaSize the alpha size + * @param depthSize the depth size + * @param stencilSize the stencil size */ public void setEGLConfigChooser(int redSize, int greenSize, int blueSize, int alphaSize, int depthSize, int stencilSize) { @@ -358,8 +381,8 @@ public void setEGLContextClientVersion(int version) { * This method can only be called after {@link #setRenderer(Renderer)} * * @param renderMode one of the RENDERMODE_X constants - * @see #RENDERMODE_CONTINUOUSLY - * @see #RENDERMODE_WHEN_DIRTY + * @see #RENDERMODE_CONTINUOUSLY #RENDERMODE_CONTINUOUSLY + * @see #RENDERMODE_WHEN_DIRTY #RENDERMODE_WHEN_DIRTY */ public void setRenderMode(int renderMode) { mGLThread.setRenderMode(renderMode); @@ -370,8 +393,8 @@ public void setRenderMode(int renderMode) { * from any thread. Must not be called before a renderer has been set. * * @return the current rendering mode. - * @see #RENDERMODE_CONTINUOUSLY - * @see #RENDERMODE_WHEN_DIRTY + * @see #RENDERMODE_CONTINUOUSLY #RENDERMODE_CONTINUOUSLY + * @see #RENDERMODE_WHEN_DIRTY #RENDERMODE_WHEN_DIRTY */ public int getRenderMode() { return mGLThread.getRenderMode(); @@ -456,7 +479,7 @@ protected void onAttachedToWindow() { if (LOG_ATTACH_DETACH) { Log.d(TAG, "onAttachedToWindow reattach =" + mDetached); } - if (mDetached && (mRenderer != null)) { + if (mDetached && mRenderer != null) { int renderMode = RENDERMODE_CONTINUOUSLY; if (mGLThread != null) { renderMode = mGLThread.getRenderMode(); @@ -482,6 +505,11 @@ protected void onDetachedFromWindow() { super.onDetachedFromWindow(); } + /** + * Gets egl context. + * + * @return the egl context + */ public EGLContext getEglContext() { return mGLThread.getEglContext(); } @@ -529,16 +557,14 @@ public interface GLWrapper { * that it still needs. The {@link #onSurfaceCreated(GL10, EGLConfig)} method * is a convenient place to do this. 
* - * @see GLTextureView#setRenderer(GLTextureView.Renderer) + * @see GLTextureView#setRenderer(GLTextureView.Renderer) GLTextureView#setRenderer(GLTextureView.Renderer) */ public interface Renderer { /** * Called when the surface is created or recreated. * - * @param gl the GL interface. Use instanceof to - * test if the interface supports GL11 or higher interfaces. - * @param config the EGLConfig of the created surface. Can be used - * to create matching pbuffers. + * @param gl the GL interface. Use instanceof to test if the interface supports GL11 or higher interfaces. + * @param config the EGLConfig of the created surface. Can be used to create matching pbuffers. */ void onSurfaceCreated(GL10 gl, EGLConfig config); @@ -549,8 +575,9 @@ public interface Renderer { * the OpenGL ES surface size changes. *

    * - * @param gl the GL interface. Use instanceof to - * test if the interface supports GL11 or higher interfaces. + * @param gl the GL interface. Use instanceof to test if the interface supports GL11 or higher interfaces. + * @param width the width + * @param height the height */ void onSurfaceChanged(GL10 gl, int width, int height); @@ -560,8 +587,7 @@ public interface Renderer { * This method is responsible for drawing the current frame. *

    * - * @param gl the GL interface. Use instanceof to - * test if the interface supports GL11 or higher interfaces. + * @param gl the GL interface. Use instanceof to test if the interface supports GL11 or higher interfaces. */ void onDrawFrame(GL10 gl); } @@ -573,12 +599,27 @@ public interface Renderer { * {@link GLTextureView#setEGLContextFactory(GLTextureView.EGLContextFactory)} */ public interface EGLContextFactory { + /** + * Create context egl context. + * + * @param egl the egl + * @param display the display + * @param eglConfig the egl config + * @return the egl context + */ EGLContext createContext(EGL10 egl, EGLDisplay display, EGLConfig eglConfig); + /** + * Destroy context. + * + * @param egl the egl + * @param display the display + * @param context the context + */ void destroyContext(EGL10 egl, EGLDisplay display, EGLContext context); } - private class DefaultContextFactory implements GLTextureView.EGLContextFactory { + private final class DefaultContextFactory implements GLTextureView.EGLContextFactory { private static final int EGL_CONTEXT_CLIENT_VERSION = 0x3098; @Override @@ -611,15 +652,28 @@ public void destroyContext(EGL10 egl, EGLDisplay display, */ public interface EGLWindowSurfaceFactory { /** + * Create window surface egl surface. + * + * @param egl the egl + * @param display the display + * @param config the config + * @param nativeWindow the native window * @return null if the surface cannot be constructed. */ EGLSurface createWindowSurface(EGL10 egl, EGLDisplay display, EGLConfig config, Object nativeWindow); + /** + * Destroy surface. 
+ * + * @param egl the egl + * @param display the display + * @param surface the surface + */ void destroySurface(EGL10 egl, EGLDisplay display, EGLSurface surface); } - private static class DefaultWindowSurfaceFactory implements GLTextureView.EGLWindowSurfaceFactory { + private static final class DefaultWindowSurfaceFactory implements GLTextureView.EGLWindowSurfaceFactory { @Override public EGLSurface createWindowSurface(EGL10 egl, EGLDisplay display, @@ -662,7 +716,12 @@ public interface EGLConfigChooser { } private abstract class BaseConfigChooser implements GLTextureView.EGLConfigChooser { - public BaseConfigChooser(int[] configSpec) { + /** + * Instantiates a new Base config chooser. + * + * @param configSpec the config spec + */ + private BaseConfigChooser(int[] configSpec) { mConfigSpec = filterConfigSpec(configSpec); } @@ -693,9 +752,20 @@ public EGLConfig chooseConfig(EGL10 egl, EGLDisplay display) { return config; } + /** + * Choose config egl config. + * + * @param egl the egl + * @param display the display + * @param configs the configs + * @return the egl config + */ abstract EGLConfig chooseConfig(EGL10 egl, EGLDisplay display, EGLConfig[] configs); + /** + * The M config spec. + */ protected int[] mConfigSpec; private int[] filterConfigSpec(int[] configSpec) { @@ -724,8 +794,8 @@ private int[] filterConfigSpec(int[] configSpec) { * and at least the specified depth and stencil sizes. 
*/ private class ComponentSizeChooser extends BaseConfigChooser { - public ComponentSizeChooser(int redSize, int greenSize, int blueSize, - int alphaSize, int depthSize, int stencilSize) { + private ComponentSizeChooser(int redSize, int greenSize, int blueSize, + int alphaSize, int depthSize, int stencilSize) { super(new int[]{ EGL10.EGL_RED_SIZE, redSize, EGL10.EGL_GREEN_SIZE, greenSize, @@ -751,7 +821,7 @@ public EGLConfig chooseConfig(EGL10 egl, EGLDisplay display, EGL10.EGL_DEPTH_SIZE, 0); int s = findConfigAttrib(egl, display, config, EGL10.EGL_STENCIL_SIZE, 0); - if ((d >= mDepthSize) && (s >= mStencilSize)) { + if (d >= mDepthSize && s >= mStencilSize) { int r = findConfigAttrib(egl, display, config, EGL10.EGL_RED_SIZE, 0); int g = findConfigAttrib(egl, display, config, @@ -760,8 +830,8 @@ public EGLConfig chooseConfig(EGL10 egl, EGLDisplay display, EGL10.EGL_BLUE_SIZE, 0); int a = findConfigAttrib(egl, display, config, EGL10.EGL_ALPHA_SIZE, 0); - if ((r == mRedSize) && (g == mGreenSize) - && (b == mBlueSize) && (a == mAlphaSize)) { + if (r == mRedSize && g == mGreenSize + && b == mBlueSize && a == mAlphaSize) { return config; } } @@ -779,12 +849,30 @@ private int findConfigAttrib(EGL10 egl, EGLDisplay display, } private int[] mValue; - // Subclasses can adjust these values: + /** + * Subclasses can adjust these values: + * The M red size. + */ protected int mRedSize; + /** + * The M green size. + */ protected int mGreenSize; + /** + * The M blue size. + */ protected int mBlueSize; + /** + * The M alpha size. + */ protected int mAlphaSize; + /** + * The M depth size. + */ protected int mDepthSize; + /** + * The M stencil size. + */ protected int mStencilSize; } @@ -792,8 +880,13 @@ private int findConfigAttrib(EGL10 egl, EGLDisplay display, * This class will choose a RGB_888 surface with * or without a depth buffer. 
*/ - private class SimpleEGLConfigChooser extends ComponentSizeChooser { - public SimpleEGLConfigChooser(boolean withDepthBuffer) { + private final class SimpleEGLConfigChooser extends ComponentSizeChooser { + /** + * Instantiates a new Simple egl config chooser. + * + * @param withDepthBuffer the with depth buffer + */ + private SimpleEGLConfigChooser(boolean withDepthBuffer) { super(8, 8, 8, 0, withDepthBuffer ? 16 : 0, 0); } } @@ -802,8 +895,13 @@ public SimpleEGLConfigChooser(boolean withDepthBuffer) { * An EGL helper class. */ - private static class EglHelper { - public EglHelper(WeakReference glTextureViewWeakRef) { + private static final class EglHelper { + /** + * Instantiates a new Egl helper. + * + * @param glTextureViewWeakRef the gl texture view weak ref + */ + private EglHelper(WeakReference glTextureViewWeakRef) { mGLTextureViewWeakRef = glTextureViewWeakRef; } @@ -925,6 +1023,8 @@ public boolean createSurface() { /** * Create a GL object for the current EGL context. + * + * @return the gl */ GL createGL() { @@ -962,6 +1062,9 @@ public int swap() { return EGL10.EGL_SUCCESS; } + /** + * Destroy surface. + */ public void destroySurface() { if (LOG_EGL) { Log.w("EglHelper", "destroySurface() tid=" + Thread.currentThread().getId()); @@ -988,6 +1091,9 @@ private void destroySurfaceImp() { } } + /** + * Finish. + */ public void finish() { if (LOG_EGL) { Log.w("EglHelper", "finish() tid=" + Thread.currentThread().getId()); @@ -1006,6 +1112,12 @@ public void finish() { } } + /** + * Throw egl exception. + * + * @param function the function + * @param error the error + */ public static void throwEglException(String function, int error) { String message = formatEglError(function, error); if (LOG_THREADS) { @@ -1015,19 +1127,48 @@ public static void throwEglException(String function, int error) { throw new RuntimeException(message); } + /** + * Log egl error as warning. 
+ * + * @param tag the tag + * @param function the function + * @param error the error + */ public static void logEglErrorAsWarning(String tag, String function, int error) { Log.w(tag, formatEglError(function, error)); } + /** + * Format egl error string. + * + * @param function the function + * @param error the error + * @return the string + */ public static String formatEglError(String function, int error) { return function + " failed: " + LogWriter.getErrorString(error); } private WeakReference mGLTextureViewWeakRef; + /** + * The M egl. + */ EGL10 mEgl; + /** + * The M egl display. + */ EGLDisplay mEglDisplay; + /** + * The M egl surface. + */ EGLSurface mEglSurface; + /** + * The M egl config. + */ EGLConfig mEglConfig; + /** + * The M egl context. + */ EGLContext mEglContext; } @@ -1041,6 +1182,11 @@ public static String formatEglError(String function, int error) { * sGLThreadManager object. This avoids multiple-lock ordering issues. */ static class GLThread extends Thread { + /** + * Instantiates a new Gl thread. 
+ * + * @param glTextureViewWeakRef the gl texture view weak ref + */ GLThread(WeakReference glTextureViewWeakRef) { super(); mWidth = 0; @@ -1063,7 +1209,7 @@ public void run() { } catch (InterruptedException e) { // fall thru and exit normally } finally { - sGLThreadManager.threadExiting(this); + GLOBAL_GLTHREAD_MANAGER.threadExiting(this); } } @@ -1086,7 +1232,7 @@ private void stopEglContextLocked() { if (mHaveEglContext) { mEglHelper.finish(); mHaveEglContext = false; - sGLThreadManager.releaseEglContextLocked(this); + GLOBAL_GLTHREAD_MANAGER.releaseEglContextLocked(this); } } @@ -1112,7 +1258,7 @@ private void guardedRun() throws InterruptedException { Runnable finishDrawingRunnable = null; while (true) { - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { while (true) { if (mShouldExit) { return; @@ -1128,7 +1274,7 @@ private void guardedRun() throws InterruptedException { if (mPaused != mRequestPaused) { pausing = mRequestPaused; mPaused = mRequestPaused; - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); if (LOG_PAUSE_RESUME) { Log.i("GLThread", "mPaused is now " + mPaused + " tid=" + getId()); } @@ -1173,7 +1319,7 @@ private void guardedRun() throws InterruptedException { } // Have we lost the SurfaceView surface? - if ((!mHasSurface) && (!mWaitingForSurface)) { + if (!mHasSurface && !mWaitingForSurface) { if (LOG_SURFACE) { Log.i("GLThread", "noticed surfaceView surface lost tid=" + getId()); } @@ -1182,7 +1328,7 @@ private void guardedRun() throws InterruptedException { } mWaitingForSurface = true; mSurfaceIsBad = false; - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); } // Have we acquired the surface view surface? 
@@ -1191,7 +1337,7 @@ private void guardedRun() throws InterruptedException { Log.i("GLThread", "noticed surfaceView surface acquired tid=" + getId()); } mWaitingForSurface = false; - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); } if (doRenderNotification) { @@ -1201,7 +1347,7 @@ private void guardedRun() throws InterruptedException { mWantRenderNotification = false; doRenderNotification = false; mRenderComplete = true; - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); } if (mFinishDrawingRunnable != null) { @@ -1220,13 +1366,13 @@ private void guardedRun() throws InterruptedException { try { mEglHelper.start(); } catch (RuntimeException t) { - sGLThreadManager.releaseEglContextLocked(this); + GLOBAL_GLTHREAD_MANAGER.releaseEglContextLocked(this); throw t; } mHaveEglContext = true; createEglContext = true; - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); } } @@ -1255,7 +1401,7 @@ private void guardedRun() throws InterruptedException { mSizeChanged = false; } mRequestRender = false; - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); if (mWantRenderNotification) { wantRenderNotification = true; } @@ -1263,8 +1409,8 @@ private void guardedRun() throws InterruptedException { } } else { if (finishDrawingRunnable != null) { - Log.w(TAG, "Warning, !readyToDraw() but waiting for " + - "draw finished! Early reporting draw finished."); + Log.w(TAG, "Warning, !readyToDraw() but waiting for " + + "draw finished! 
Early reporting draw finished."); finishDrawingRunnable.run(); finishDrawingRunnable = null; } @@ -1284,7 +1430,7 @@ private void guardedRun() throws InterruptedException { + " mRequestRender: " + mRequestRender + " mRenderMode: " + mRenderMode); } - sGLThreadManager.wait(); + GLOBAL_GLTHREAD_MANAGER.wait(); } } // end of synchronized(sGLThreadManager) @@ -1299,15 +1445,15 @@ private void guardedRun() throws InterruptedException { Log.w("GLThread", "egl createSurface"); } if (mEglHelper.createSurface()) { - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { mFinishedCreatingEglSurface = true; - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); } } else { - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { mFinishedCreatingEglSurface = true; mSurfaceIsBad = true; - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); } continue; } @@ -1345,14 +1491,12 @@ private void guardedRun() throws InterruptedException { if (LOG_RENDERER_DRAW_FRAME) { Log.w("GLThread", "onDrawFrame tid=" + getId()); } - { - GLTextureView view = mGLTextureViewWeakRef.get(); - if (view != null) { - view.mRenderer.onDrawFrame(gl); - if (finishDrawingRunnable != null) { - finishDrawingRunnable.run(); - finishDrawingRunnable = null; - } + GLTextureView view = mGLTextureViewWeakRef.get(); + if (view != null) { + view.mRenderer.onDrawFrame(gl); + if (finishDrawingRunnable != null) { + finishDrawingRunnable.run(); + finishDrawingRunnable = null; } } int swapError = mEglHelper.swap(); @@ -1372,9 +1516,9 @@ private void guardedRun() throws InterruptedException { // Log the error to help developers understand why rendering stopped. 
EglHelper.logEglErrorAsWarning("GLThread", "eglSwapBuffers", swapError); - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { mSurfaceIsBad = true; - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); } break; } @@ -1389,48 +1533,80 @@ private void guardedRun() throws InterruptedException { /* * clean-up everything... */ - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { stopEglSurfaceLocked(); stopEglContextLocked(); } } } + /** + * Able to draw boolean. + * + * @return the boolean + */ public boolean ableToDraw() { return mHaveEglContext && mHaveEglSurface && readyToDraw(); } + /** + * Ready to draw boolean. + * + * @return the boolean + */ public boolean readyToDraw() { - return (!mPaused) && mHasSurface && (!mSurfaceIsBad) - && (mWidth > 0) && (mHeight > 0) - && (mRequestRender || (mRenderMode == RENDERMODE_CONTINUOUSLY)); + return !mPaused + && mHasSurface + && !mSurfaceIsBad + && mWidth > 0 + && mHeight > 0 + && (mRequestRender + || mRenderMode == RENDERMODE_CONTINUOUSLY); } + /** + * Sets render mode. + * + * @param renderMode the render mode + */ public void setRenderMode(int renderMode) { - if (!((RENDERMODE_WHEN_DIRTY <= renderMode) && (renderMode <= RENDERMODE_CONTINUOUSLY))) { + if (!(RENDERMODE_WHEN_DIRTY <= renderMode && renderMode <= RENDERMODE_CONTINUOUSLY)) { throw new IllegalArgumentException("renderMode"); } - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { mRenderMode = renderMode; - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); } } + /** + * Gets render mode. + * + * @return the render mode + */ public int getRenderMode() { - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { return mRenderMode; } } + /** + * Request render.
+ */ public void requestRender() { - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { mRequestRender = true; - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); } } + /** + * Request render and notify. + * + * @param finishDrawing the finish drawing + */ public void requestRenderAndNotify(Runnable finishDrawing) { - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { // If we are already on the GL thread, this means a client callback // has caused reentrancy, for example via updating the SurfaceView parameters. // We will return to the client rendering code, so here we don't need to @@ -1444,23 +1620,26 @@ public void requestRenderAndNotify(Runnable finishDrawing) { mRenderComplete = false; mFinishDrawingRunnable = finishDrawing; - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); } } + /** + * Surface created. + */ public void surfaceCreated() { - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { if (LOG_THREADS) { Log.i("GLThread", "surfaceCreated tid=" + getId()); } mHasSurface = true; mFinishedCreatingEglSurface = false; - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); while (mWaitingForSurface && !mFinishedCreatingEglSurface && !mExited) { try { - sGLThreadManager.wait(); + GLOBAL_GLTHREAD_MANAGER.wait(); } catch (InterruptedException e) { Thread.currentThread().interrupt(); } @@ -1468,16 +1647,19 @@ public void surfaceCreated() { } } + /** + * Surface destroyed. 
+ */ public void surfaceDestroyed() { - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { if (LOG_THREADS) { Log.i("GLThread", "surfaceDestroyed tid=" + getId()); } mHasSurface = false; - sGLThreadManager.notifyAll(); - while ((!mWaitingForSurface) && (!mExited)) { + GLOBAL_GLTHREAD_MANAGER.notifyAll(); + while (!mWaitingForSurface && !mExited) { try { - sGLThreadManager.wait(); + GLOBAL_GLTHREAD_MANAGER.wait(); } catch (InterruptedException e) { Thread.currentThread().interrupt(); } @@ -1485,19 +1667,22 @@ public void surfaceDestroyed() { } } + /** + * On pause. + */ public void onPause() { - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { if (LOG_PAUSE_RESUME) { Log.i("GLThread", "onPause tid=" + getId()); } mRequestPaused = true; - sGLThreadManager.notifyAll(); - while ((!mExited) && (!mPaused)) { + GLOBAL_GLTHREAD_MANAGER.notifyAll(); + while (!mExited && !mPaused) { if (LOG_PAUSE_RESUME) { Log.i("Main thread", "onPause waiting for mPaused."); } try { - sGLThreadManager.wait(); + GLOBAL_GLTHREAD_MANAGER.wait(); } catch (InterruptedException ex) { Thread.currentThread().interrupt(); } @@ -1505,21 +1690,24 @@ public void onPause() { } } + /** + * On resume. + */ public void onResume() { - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { if (LOG_PAUSE_RESUME) { Log.i("GLThread", "onResume tid=" + getId()); } mRequestPaused = false; mRequestRender = true; mRenderComplete = false; - sGLThreadManager.notifyAll(); - while ((!mExited) && mPaused && (!mRenderComplete)) { + GLOBAL_GLTHREAD_MANAGER.notifyAll(); + while (!mExited && mPaused && !mRenderComplete) { if (LOG_PAUSE_RESUME) { Log.i("Main thread", "onResume waiting for !mPaused."); } try { - sGLThreadManager.wait(); + GLOBAL_GLTHREAD_MANAGER.wait(); } catch (InterruptedException ex) { Thread.currentThread().interrupt(); } @@ -1527,8 +1715,14 @@ public void onResume() { } } + /** + * On window resize. 
+ * + * @param w the w + * @param h the h + */ public void onWindowResize(int w, int h) { - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { mWidth = w; mHeight = h; mSizeChanged = true; @@ -1544,7 +1738,7 @@ public void onWindowResize(int w, int h) { return; } - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); // Wait for thread to react to resize and render a frame while (!mExited && !mPaused && !mRenderComplete @@ -1553,7 +1747,7 @@ && ableToDraw()) { Log.i("Main thread", "onWindowResize waiting for render complete from tid=" + getId()); } try { - sGLThreadManager.wait(); + GLOBAL_GLTHREAD_MANAGER.wait(); } catch (InterruptedException ex) { Thread.currentThread().interrupt(); } @@ -1561,15 +1755,18 @@ && ableToDraw()) { } } + /** + * Request exit and wait. + */ public void requestExitAndWait() { // don't call this from GLThread thread or it is a guaranteed // deadlock! - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { mShouldExit = true; - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); while (!mExited) { try { - sGLThreadManager.wait(); + GLOBAL_GLTHREAD_MANAGER.wait(); } catch (InterruptedException ex) { Thread.currentThread().interrupt(); } @@ -1577,9 +1774,12 @@ public void requestExitAndWait() { } } + /** + * Request release egl context locked. + */ public void requestReleaseEglContextLocked() { mShouldReleaseEglContext = true; - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); } /** @@ -1591,12 +1791,17 @@ public void queueEvent(Runnable r) { if (r == null) { throw new IllegalArgumentException("r must not be null"); } - synchronized (sGLThreadManager) { + synchronized (GLOBAL_GLTHREAD_MANAGER) { mEventQueue.add(r); - sGLThreadManager.notifyAll(); + GLOBAL_GLTHREAD_MANAGER.notifyAll(); } } + /** + * Gets egl context. 
+ * + * @return the egl context + */ public EGLContext getEglContext() { if (mEglHelper.mEglContext == null || EGL10.EGL_NO_CONTEXT == mEglHelper.mEglContext) { Log.i("GLThread", "getEglContext mEglContext is invalid."); @@ -1641,6 +1846,9 @@ public EGLContext getEglContext() { } + /** + * The type Log writer. + */ static class LogWriter extends Writer { @Override @@ -1672,6 +1880,12 @@ private void flushBuilder() { } } + /** + * Gets error string. + * + * @param error the error + * @return the error string + */ public static String getErrorString(int error) { switch (error) { case EGL_SUCCESS: @@ -1723,9 +1937,13 @@ private void checkRenderThreadState() { } } - private static class GLThreadManager { - private static String TAG = "GLThreadManager"; + private static final class GLThreadManager { + /** + * Thread exiting. + * + * @param thread the thread + */ public synchronized void threadExiting(GLThread thread) { if (LOG_THREADS) { Log.i("GLThread", "exiting tid=" + thread.getId()); @@ -1734,6 +1952,11 @@ public synchronized void threadExiting(GLThread thread) { notifyAll(); } + /** + * Release egl context locked. + * + * @param thread the thread + */ /* * Releases the EGL context. Requires that we are already in the * sGLThreadManager monitor when this is called. 
@@ -1743,7 +1966,7 @@ public void releaseEglContextLocked(GLThread thread) { } } - private static final GLThreadManager sGLThreadManager = new GLThreadManager(); + private static final GLThreadManager GLOBAL_GLTHREAD_MANAGER = new GLThreadManager(); private final WeakReference mThisWeakRef = diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/videoRender/YuvFboProgram.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/videoRender/YuvFboProgram.java index 9c84356c7..90c911f6b 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/videoRender/YuvFboProgram.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/videoRender/YuvFboProgram.java @@ -8,6 +8,9 @@ import io.agora.base.internal.video.GlUtil; import io.agora.base.internal.video.RendererCommon; +/** + * The type Yuv fbo program. + */ public class YuvFboProgram { private int[] mFboTextureId; @@ -17,13 +20,19 @@ public class YuvFboProgram { private int mWidth, mHeight; private volatile boolean isRelease; - // GL Thread + /** + * Instantiates a new Yuv fbo program. + */ +// GL Thread public YuvFboProgram() { yuvUploader = new YuvUploader(); glRectDrawer = new GlRectDrawer(); } - // GL Thread + /** + * Release. + */ +// GL Thread public void release() { isRelease = true; if (mFboTextureId != null) { @@ -35,7 +44,15 @@ public void release() { } } - // GL Thread + /** + * Draw yuv integer. 
+ * + * @param yuv the yuv + * @param width the width + * @param height the height + * @return the integer + */ +// GL Thread public Integer drawYuv(byte[] yuv, int width, int height) { if (isRelease) { return -1; @@ -77,7 +94,7 @@ public Integer drawYuv(byte[] yuv, int width, int height) { yuvUploader.uploadFromBuffer(i420Buffer); Matrix matrix = new Matrix(); matrix.preTranslate(0.5f, 0.5f); - matrix.preScale(1f, -1f);// I420-frames are upside down + matrix.preScale(1f, -1f); // I420-frames are upside down matrix.preTranslate(-0.5f, -0.5f); glRectDrawer.drawYuv(yuvUploader.getYuvTextures(), RendererCommon.convertMatrixFromAndroidGraphicsMatrix(matrix), width, height, 0, 0, width, height); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/videoRender/YuvUploader.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/videoRender/YuvUploader.java index 247365045..8c2a1eae1 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/videoRender/YuvUploader.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/videoRender/YuvUploader.java @@ -10,6 +10,9 @@ import io.agora.base.internal.video.GlUtil; import io.agora.base.internal.video.YuvHelper; +/** + * The type Yuv uploader. + */ public class YuvUploader { // Intermediate copy buffer for uploading yuv frames that are not packed, i.e. stride > width. // TODO(magjed): Investigate when GL_UNPACK_ROW_LENGTH is available, or make a custom shader @@ -20,6 +23,10 @@ public class YuvUploader { /** * Upload |planes| into OpenGL textures, taking stride into consideration. * + * @param width the width + * @param height the height + * @param strides the strides + * @param planes the planes * @return Array of three texture indices corresponding to Y-, U-, and V-plane respectively. 
*/ @Nullable @@ -65,6 +72,12 @@ public int[] uploadYuvData(int width, int height, int[] strides, ByteBuffer[] pl return yuvTextures; } + /** + * Uploads the planes of an I420 buffer into OpenGL textures. + * + * @param buffer the buffer + * @return array of Y-, U-, and V-plane texture indices + */ @Nullable public int[] uploadFromBuffer(VideoFrame.I420Buffer buffer) { int[] strides = {buffer.getStrideY(), buffer.getStrideU(), buffer.getStrideV()}; @@ -72,6 +85,11 @@ public int[] uploadFromBuffer(VideoFrame.I420Buffer buffer) { return uploadYuvData(buffer.getWidth(), buffer.getHeight(), strides, planes); } + /** + * Gets the Y-, U-, and V-plane texture indices. + * + * @return array of Y-, U-, and V-plane texture indices + */ @Nullable public int[] getYuvTextures() { return yuvTextures; diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayer.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayer.java index 1231c0ed1..66a693653 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayer.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayer.java @@ -17,6 +17,9 @@ import io.agora.api.example.common.BaseFragment; import io.agora.api.example.databinding.FragmentAudiorouterPlayerBinding; +/** + * The audio router player example.
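uploadYuvData above uploads planes "taking stride into consideration": when a plane's stride is wider than its row, the rows are first copied into a tightly packed intermediate buffer. A self-contained sketch of that idea (names are illustrative, not the SDK's):

```java
import java.nio.ByteBuffer;

/** Illustrative repacking of a padded YUV plane (stride > width) into a tight buffer. */
public final class PlanePacker {
    private PlanePacker() {
    }

    public static ByteBuffer pack(ByteBuffer plane, int width, int height, int stride) {
        if (stride == width) {
            return plane; // already tightly packed; upload as-is
        }
        ByteBuffer packed = ByteBuffer.allocate(width * height);
        for (int row = 0; row < height; row++) {
            plane.position(row * stride); // jump to the start of this row
            ByteBuffer slice = plane.slice();
            slice.limit(width);           // drop the per-row padding bytes
            packed.put(slice);
        }
        packed.flip();
        return packed;
    }
}
```

This is the classic workaround when GL_UNPACK_ROW_LENGTH is unavailable, as the TODO in YuvUploader notes.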
+ */ @Example( index = 17, group = ADVANCED, @@ -41,7 +44,7 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat mBinding.btnJoin.setOnClickListener(v -> { String channelId = mBinding.etChannel.getText().toString(); - if(TextUtils.isEmpty(channelId)){ + if (TextUtils.isEmpty(channelId)) { showAlert("Please enter channel id firstly!"); return; } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayerExo.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayerExo.java index 1c636bc37..367b01c4e 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayerExo.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayerExo.java @@ -39,7 +39,10 @@ import io.agora.rtc2.video.VideoCanvas; import io.agora.rtc2.video.VideoEncoderConfiguration; -public class AudioRouterPlayerExo extends BaseFragment{ +/** + * The type Audio router player exo. 
+ */ +public class AudioRouterPlayerExo extends BaseFragment { private static final String TAG = "AudioRouterPlayerExo"; private FragmentAudiorouterPlayerDetailBinding mBinding; private RtcEngine mRtcEngine; @@ -51,14 +54,12 @@ public void onCreate(@Nullable Bundle savedInstanceState) { super.onCreate(savedInstanceState); // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } String channelId = requireArguments().getString("channelId"); - try - { + try { RtcEngineConfig config = new RtcEngineConfig(); /* * The context of Android Activity @@ -81,7 +82,7 @@ public void onCreate(@Nullable Bundle savedInstanceState) { */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)requireActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) requireActivity().getApplication()).getGlobalSettings().getAreaCode(); mRtcEngine = RtcEngine.create(config); /* * This parameter is for reporting the usages of APIExample to agora background. 
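Further down, the onError handlers in these player fragments chain the token checks with `else` so that a single callback raises at most one alert. A plain-Java sketch of the intended branching (the numeric codes are hypothetical stand-ins, not the SDK's constants):

```java
/** Sketch of the corrected onError chain: else-if keeps the alerts mutually exclusive. */
public final class ErrorAlerts {
    private ErrorAlerts() {
    }

    // Hypothetical stand-in codes; the real values come from io.agora.rtc2.Constants.
    static final int ERR_INVALID_TOKEN = 110;
    static final int ERR_TOKEN_EXPIRED = 109;

    public static String alertFor(int err) {
        if (err == ERR_INVALID_TOKEN) {
            return "token_invalid";
        } else if (err == ERR_TOKEN_EXPIRED) { // 'else' prevents a second alert
            return "token_expired";
        }
        return null; // other codes are only toasted, never alerted
    }
}
```

Without the `else`, the second condition is evaluated on every error, which is what the original `} if (...)` did.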
@@ -110,8 +111,8 @@ public void onCreate(@Nullable Bundle savedInstanceState) { mRtcEngine.enableVideo(); // Setup video encoding configs mRtcEngine.setVideoEncoderConfiguration(new VideoEncoderConfiguration( - ((MainApplication)requireActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), - VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication)requireActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), + ((MainApplication) requireActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), + VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) requireActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), STANDARD_BITRATE, VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_ADAPTIVE )); @@ -132,8 +133,7 @@ public void onCreate(@Nullable Bundle savedInstanceState) { /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ int res = mRtcEngine.joinChannel(ret, channelId, uid, option); - if (res != 0) - { + if (res != 0) { // Usually happens with invalid parameters // Error code description can be found at: // en: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html @@ -141,9 +141,7 @@ public void onCreate(@Nullable Bundle savedInstanceState) { showAlert(RtcEngine.getErrorDescription(Math.abs(res))); } }); - } - catch (Exception e) - { + } catch (Exception e) { e.printStackTrace(); requireActivity().onBackPressed(); } @@ -164,8 +162,7 @@ public void onEvents(Player player, Player.Events events) { @Override public void onDestroy() { super.onDestroy(); - if(mRtcEngine != null) - { + if (mRtcEngine != null) { mRtcEngine.leaveChannel(); } handler.post(RtcEngine::destroy); @@ -208,8 +205,7 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat * IRtcEngineEventHandler is an abstract class providing 
default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ - private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() - { + private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() { /** * Error code description can be found at: * en: ... @@ -221,7 +217,7 @@ public void onError(int err) { showLongToast("Error code:" + err + ", msg:" + RtcEngine.getErrorDescription(err)); if (Constants.ERR_INVALID_TOKEN == err) { showAlert(getString(R.string.token_invalid)); - } if (Constants.ERR_TOKEN_EXPIRED == err) { + } else if (Constants.ERR_TOKEN_EXPIRED == err) { showAlert(getString(R.string.token_expired)); } } @@ -233,8 +229,7 @@ public void onError(int err) { * @param uid User ID * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/ @Override - public void onJoinChannelSuccess(String channel, int uid, int elapsed) - { + public void onJoinChannelSuccess(String channel, int uid, int elapsed) { Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); showLongToast(String.format(Locale.US, "onJoinChannelSuccess channel %s uid %d", channel, uid)); runOnUIThread(() -> mBinding.localVideo.setReportUid(uid)); @@ -245,8 +240,7 @@ public void onJoinChannelSuccess(String channel, int uid, int elapsed) * @param elapsed Time delay (ms) from the local user calling joinChannel/setClientRole * until this callback is triggered.*/ @Override - public void onUserJoined(int uid, int elapsed) - { + public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format(Locale.US, "user %d joined!", uid)); @@ -255,8 +249,7 @@ public void onUserJoined(int uid, int elapsed) if (context == null) { return; } - runOnUIThread(() -> - { + runOnUIThread(() -> { /*Display remote video stream*/ // Create render view by RtcEngine SurfaceView surfaceView 
= new SurfaceView(context); @@ -282,8 +275,7 @@ public void onUserJoined(int uid, int elapsed) * USER_OFFLINE_BECOME_AUDIENCE(2): (Live broadcast only.) The client role switched from * the host to the audience.*/ @Override - public void onUserOffline(int uid, int reason) - { + public void onUserOffline(int uid, int reason) { Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason)); showLongToast(String.format(Locale.US, "user %d offline! reason:%d", uid, reason)); runOnUIThread(() -> { diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayerIjk.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayerIjk.java index 4afaef2db..808ec3cf9 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayerIjk.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayerIjk.java @@ -37,7 +37,10 @@ import io.agora.rtc2.video.VideoEncoderConfiguration; import tv.danmaku.ijk.media.player.IjkMediaPlayer; -public class AudioRouterPlayerIjk extends BaseFragment{ +/** + * The type Audio router player ijk. 
+ */ +public class AudioRouterPlayerIjk extends BaseFragment { private static final String TAG = "AudioRouterPlayerIjk"; private FragmentAudiorouterPlayerDetailBinding mBinding; private RtcEngine mRtcEngine; @@ -49,14 +52,12 @@ public void onCreate(@Nullable Bundle savedInstanceState) { super.onCreate(savedInstanceState); // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } String channelId = requireArguments().getString("channelId"); - try - { + try { RtcEngineConfig config = new RtcEngineConfig(); /* * The context of Android Activity @@ -79,7 +80,7 @@ public void onCreate(@Nullable Bundle savedInstanceState) { */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)requireActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) requireActivity().getApplication()).getGlobalSettings().getAreaCode(); mRtcEngine = RtcEngine.create(config); /* * This parameter is for reporting the usages of APIExample to agora background. 
@@ -107,8 +108,8 @@ public void onCreate(@Nullable Bundle savedInstanceState) { mRtcEngine.enableVideo(); // Setup video encoding configs mRtcEngine.setVideoEncoderConfiguration(new VideoEncoderConfiguration( - ((MainApplication)requireActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), - VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication)requireActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), + ((MainApplication) requireActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), + VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) requireActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), STANDARD_BITRATE, VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_ADAPTIVE )); @@ -129,8 +130,7 @@ public void onCreate(@Nullable Bundle savedInstanceState) { /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ int res = mRtcEngine.joinChannel(ret, channelId, uid, option); - if (res != 0) - { + if (res != 0) { // Usually happens with invalid parameters // Error code description can be found at: // en: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html @@ -138,9 +138,7 @@ public void onCreate(@Nullable Bundle savedInstanceState) { showAlert(RtcEngine.getErrorDescription(Math.abs(res))); } }); - } - catch (Exception e) - { + } catch (Exception e) { e.printStackTrace(); requireActivity().onBackPressed(); } @@ -159,8 +157,7 @@ public void onCreate(@Nullable Bundle savedInstanceState) { @Override public void onDestroy() { super.onDestroy(); - if(mRtcEngine != null) - { + if (mRtcEngine != null) { mRtcEngine.leaveChannel(); } handler.post(RtcEngine::destroy); @@ -219,8 +216,7 @@ public void surfaceDestroyed(@NonNull SurfaceHolder holder) { * IRtcEngineEventHandler is an abstract class providing default 
implementation. * The SDK uses this class to report to the app on SDK runtime events. */ - private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() - { + private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() { /** * Error code description can be found at: * en: ... @@ -232,7 +228,7 @@ public void onError(int err) { showLongToast("Error code:" + err + ", msg:" + RtcEngine.getErrorDescription(err)); if (Constants.ERR_INVALID_TOKEN == err) { showAlert(getString(R.string.token_invalid)); - } if (Constants.ERR_TOKEN_EXPIRED == err) { + } else if (Constants.ERR_TOKEN_EXPIRED == err) { showAlert(getString(R.string.token_expired)); } } @@ -244,8 +240,7 @@ public void onError(int err) { * @param uid User ID * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/ @Override - public void onJoinChannelSuccess(String channel, int uid, int elapsed) - { + public void onJoinChannelSuccess(String channel, int uid, int elapsed) { Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); showLongToast(String.format(Locale.US, "onJoinChannelSuccess channel %s uid %d", channel, uid)); runOnUIThread(() -> mBinding.localVideo.setReportUid(uid)); @@ -256,8 +251,7 @@ public void onJoinChannelSuccess(String channel, int uid, int elapsed) * @param elapsed Time delay (ms) from the local user calling joinChannel/setClientRole * until this callback is triggered.*/ @Override - public void onUserJoined(int uid, int elapsed) - { + public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format(Locale.US, "user %d joined!", uid)); @@ -266,8 +260,7 @@ public void onUserJoined(int uid, int elapsed) if (context == null) { return; } - runOnUIThread(() -> - { + runOnUIThread(() -> { /*Display remote video stream*/ // Create render view by RtcEngine SurfaceView surfaceView = new 
SurfaceView(context); @@ -293,8 +286,7 @@ public void onUserJoined(int uid, int elapsed) * USER_OFFLINE_BECOME_AUDIENCE(2): (Live broadcast only.) The client role switched from * the host to the audience.*/ @Override - public void onUserOffline(int uid, int reason) - { + public void onUserOffline(int uid, int reason) { Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason)); showLongToast(String.format(Locale.US, "user %d offline! reason:%d", uid, reason)); runOnUIThread(() -> { diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayerNative.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayerNative.java index b295caf24..fa3e86c44 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayerNative.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioRouterPlayerNative.java @@ -38,7 +38,10 @@ import io.agora.rtc2.video.VideoCanvas; import io.agora.rtc2.video.VideoEncoderConfiguration; -public class AudioRouterPlayerNative extends BaseFragment{ +/** + * The type Audio router player native. 
+ */ +public class AudioRouterPlayerNative extends BaseFragment { private static final String TAG = "AudioRouterPlayerNative"; private FragmentAudiorouterPlayerDetailBinding mBinding; private RtcEngine mRtcEngine; @@ -50,14 +53,12 @@ public void onCreate(@Nullable Bundle savedInstanceState) { super.onCreate(savedInstanceState); // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } String channelId = requireArguments().getString("channelId"); - try - { + try { RtcEngineConfig config = new RtcEngineConfig(); /* * The context of Android Activity @@ -80,7 +81,7 @@ public void onCreate(@Nullable Bundle savedInstanceState) { */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)requireActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) requireActivity().getApplication()).getGlobalSettings().getAreaCode(); mRtcEngine = RtcEngine.create(config); /* * This parameter is for reporting the usages of APIExample to agora background. 
@@ -108,8 +109,8 @@ public void onCreate(@Nullable Bundle savedInstanceState) { mRtcEngine.enableVideo(); // Setup video encoding configs mRtcEngine.setVideoEncoderConfiguration(new VideoEncoderConfiguration( - ((MainApplication)requireActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), - VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication)requireActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), + ((MainApplication) requireActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), + VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) requireActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), STANDARD_BITRATE, VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_ADAPTIVE )); @@ -130,8 +131,7 @@ public void onCreate(@Nullable Bundle savedInstanceState) { /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ int res = mRtcEngine.joinChannel(ret, channelId, uid, option); - if (res != 0) - { + if (res != 0) { // Usually happens with invalid parameters // Error code description can be found at: // en: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html @@ -139,9 +139,7 @@ public void onCreate(@Nullable Bundle savedInstanceState) { showAlert(RtcEngine.getErrorDescription(Math.abs(res))); } }); - } - catch (Exception e) - { + } catch (Exception e) { e.printStackTrace(); requireActivity().onBackPressed(); } @@ -160,8 +158,7 @@ public void onCreate(@Nullable Bundle savedInstanceState) { @Override public void onDestroy() { super.onDestroy(); - if(mRtcEngine != null) - { + if (mRtcEngine != null) { mRtcEngine.leaveChannel(); } handler.post(RtcEngine::destroy); @@ -219,8 +216,7 @@ public void surfaceDestroyed(@NonNull SurfaceHolder holder) { * IRtcEngineEventHandler is an abstract class providing default 
implementation. * The SDK uses this class to report to the app on SDK runtime events. */ - private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() - { + private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() { /** * Error code description can be found at: * en: ... @@ -232,7 +228,7 @@ public void onError(int err) { showLongToast("Error code:" + err + ", msg:" + RtcEngine.getErrorDescription(err)); if (Constants.ERR_INVALID_TOKEN == err) { showAlert(getString(R.string.token_invalid)); - } if (Constants.ERR_TOKEN_EXPIRED == err) { + } else if (Constants.ERR_TOKEN_EXPIRED == err) { showAlert(getString(R.string.token_expired)); } } @@ -244,8 +240,7 @@ * @param uid User ID * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/ @Override - public void onJoinChannelSuccess(String channel, int uid, int elapsed) - { + public void onJoinChannelSuccess(String channel, int uid, int elapsed) { Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); showLongToast(String.format(Locale.US, "onJoinChannelSuccess channel %s uid %d", channel, uid)); runOnUIThread(() -> mBinding.localVideo.setReportUid(uid)); @@ -256,8 +251,7 @@ public void onJoinChannelSuccess(String channel, int uid, int elapsed) * @param elapsed Time delay (ms) from the local user calling joinChannel/setClientRole * until this callback is triggered.*/ @Override - public void onUserJoined(int uid, int elapsed) - { + public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format(Locale.US, "user %d joined!", uid)); @@ -266,8 +260,7 @@ public void onUserJoined(int uid, int elapsed) if (context == null) { return; } - runOnUIThread(() -> - { + runOnUIThread(() -> { /*Display remote video stream*/ // Create render view by RtcEngine SurfaceView surfaceView = new SurfaceView(context); @@ -293,8 +286,7 @@
public void onUserJoined(int uid, int elapsed) * USER_OFFLINE_BECOME_AUDIENCE(2): (Live broadcast only.) The client role switched from * the host to the audience.*/ @Override - public void onUserOffline(int uid, int reason) - { + public void onUserOffline(int uid, int reason) { Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason)); showLongToast(String.format(Locale.US, "user %d offline! reason:%d", uid, reason)); runOnUIThread(() -> { diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioWaveform.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioWaveform.java index d152c8dbf..109b634d0 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioWaveform.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/audio/AudioWaveform.java @@ -28,6 +28,9 @@ import io.agora.rtc2.RtcEngineConfig; import io.agora.rtc2.proxy.LocalAccessPointConfiguration; +/** + * The type Audio waveform. 
+ */ @Example( index = 17, group = ADVANCED, diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelAudio.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelAudio.java index 89ca1b59f..618d18362 100755 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelAudio.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelAudio.java @@ -2,9 +2,22 @@ import static io.agora.api.example.common.model.Examples.BASIC; +import android.app.Notification; +import android.app.NotificationChannel; +import android.app.NotificationManager; +import android.app.PendingIntent; +import android.app.Service; import android.content.Context; +import android.content.Intent; +import android.content.pm.ApplicationInfo; +import android.content.pm.ServiceInfo; +import android.graphics.Bitmap; +import android.graphics.BitmapFactory; +import android.os.Build; import android.os.Bundle; import android.os.Handler; +import android.os.IBinder; +import android.provider.Settings; import android.util.Log; import android.view.LayoutInflater; import android.view.View; @@ -18,13 +31,17 @@ import androidx.annotation.NonNull; import androidx.annotation.Nullable; +import androidx.appcompat.app.AlertDialog; +import androidx.core.app.NotificationManagerCompat; import com.yanzhenjie.permission.AndPermission; import com.yanzhenjie.permission.runtime.Permission; +import java.util.ArrayList; import java.util.LinkedHashMap; import java.util.Map; +import io.agora.api.example.MainActivity; import io.agora.api.example.MainApplication; import io.agora.api.example.R; import io.agora.api.example.annotation.Example; @@ -133,11 +150,11 @@ public void onNothingSelected(AdapterView parent) { audioRouteInput.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() { @Override public void onItemSelected(AdapterView parent, View view, int position, long id) { - 
if(!joined){ + if (!joined) { return; } - boolean isChatRoomMode = "CHATROOM".equals(audioScenarioInput.getSelectedItem()); - if (isChatRoomMode) { + boolean isCommunication = getString(R.string.channel_profile_communication).equals(channelProfileInput.getSelectedItem()); + if (isCommunication) { int route = Constants.AUDIO_ROUTE_EARPIECE; if (getString(R.string.audio_route_earpiece).equals(parent.getSelectedItem())) { route = Constants.AUDIO_ROUTE_EARPIECE; @@ -146,9 +163,7 @@ public void onItemSelected(AdapterView parent, View view, int position, long } else if (getString(R.string.audio_route_headset).equals(parent.getSelectedItem())) { route = Constants.AUDIO_ROUTE_HEADSET; } else if (getString(R.string.audio_route_headset_bluetooth).equals(parent.getSelectedItem())) { - route = Constants.AUDIO_ROUTE_HEADSETBLUETOOTH; - } else if (getString(R.string.audio_route_headset_typec).equals(parent.getSelectedItem())) { - route = Constants.AUDIO_ROUTE_USBDEVICE; + route = Constants.AUDIO_ROUTE_BLUETOOTH_DEVICE_HFP; } int ret = engine.setRouteInCommunicationMode(route); showShortToast("setRouteInCommunicationMode route=" + route + ", ret=" + ret); @@ -162,7 +177,6 @@ public void onItemSelected(AdapterView parent, View view, int position, long int ret = engine.setEnableSpeakerphone(isSpeakerPhone); showShortToast("setEnableSpeakerphone enable=" + isSpeakerPhone + ", ret=" + ret); } - } @Override @@ -197,6 +211,26 @@ record = view.findViewById(R.id.recordingVol); view.findViewById(R.id.audio_place_05), view.findViewById(R.id.audio_place_06) ); + + if (savedInstanceState != null) { + joined = savedInstanceState.getBoolean("joined"); + if (joined) { + myUid = savedInstanceState.getInt("myUid"); + ArrayList seatRemoteUidList = savedInstanceState.getIntegerArrayList("seatRemoteUidList"); + mute.setEnabled(true); + join.setEnabled(true); + join.setText(getString(R.string.leave)); + record.setEnabled(true); + playout.setEnabled(true); + inear.setEnabled(inEarSwitch.isChecked()); 
+ inEarSwitch.setEnabled(true); + audioSeatManager.upLocalSeat(myUid); + + for (Integer uid : seatRemoteUidList) { + audioSeatManager.upRemoteSeat(uid); + } + } + } } @Override @@ -204,27 +238,27 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { super.onActivityCreated(savedInstanceState); // Check if the context is valid Context context = getContext(); - if (context == null) { + if (context == null || engine != null) { return; } try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ @@ -232,7 +266,7 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.valueOf(audioScenarioInput.getSelectedItem().toString())); config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. 
*/ @@ -254,17 +288,81 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) { e.printStackTrace(); getActivity().onBackPressed(); } + enableNotifications(); + } + + private void enableNotifications() { + if (NotificationManagerCompat.from(requireContext()).areNotificationsEnabled()) { + Log.d(TAG, "Notifications enabled!"); + return; + } + Log.d(TAG, "Notifications not enabled!"); + new AlertDialog.Builder(requireContext()) + .setTitle("Tip") + .setMessage(R.string.notifications_enable_tip) + .setPositiveButton(R.string.setting, (dialog, which) -> { + Intent intent = new Intent(); + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) { + intent.setAction(Settings.ACTION_APP_NOTIFICATION_SETTINGS); + intent.putExtra(Settings.EXTRA_APP_PACKAGE, requireContext().getPackageName()); + intent.putExtra(Settings.EXTRA_CHANNEL_ID, requireContext().getApplicationInfo().uid); + } else { + intent.setAction(Settings.ACTION_APPLICATION_DETAILS_SETTINGS); + } + startActivity(intent); + dialog.dismiss(); + }) + .show(); } @Override - public void onDestroy() { - super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ + public void onPause() { + super.onPause(); + startRecordingService(); + } + + private void startRecordingService() { + if (joined) { + Intent intent = new Intent(requireContext(), LocalRecordingService.class); + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) { + requireContext().startForegroundService(intent); + } else { + requireContext().startService(intent); + } + } + } + + @Override + public void onSaveInstanceState(@NonNull Bundle outState) { + super.onSaveInstanceState(outState); + // join state + outState.putBoolean("joined", joined); + outState.putInt("myUid", myUid); + outState.putIntegerArrayList("seatRemoteUidList", audioSeatManager.getSeatRemoteUidList()); + } + + @Override + public void onResume() { + super.onResume(); + stopRecordingService(); + } + + private void stopRecordingService() { + Intent intent = new
Intent(requireContext(), LocalRecordingService.class); + requireContext().stopService(intent); + } + + @Override + protected void onBackPressed() { + joined = false; + stopRecordingService(); + /*leaveChannel and Destroy the RtcEngine instance*/ if (engine != null) { engine.leaveChannel(); } handler.post(RtcEngine::destroy); engine = null; + super.onBackPressed(); } @Override @@ -285,8 +383,7 @@ public void onClick(View v) { AndPermission.with(this).runtime().permission( Permission.Group.STORAGE, Permission.Group.MICROPHONE - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); audioProfileInput.setEnabled(false); @@ -294,7 +391,7 @@ public void onClick(View v) { }).start(); } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. @@ -328,7 +425,7 @@ public void onClick(View v) { } else if (v.getId() == R.id.microphone) { mute.setActivated(!mute.isActivated()); mute.setText(getString(mute.isActivated() ? R.string.openmicrophone : R.string.closemicrophone)); - /**Turn off / on the microphone, stop / start local audio collection and push streaming.*/ + /*Turn off / on the microphone, stop / start local audio collection and push streaming.*/ engine.muteLocalAudioStream(mute.isActivated()); } } @@ -338,7 +435,7 @@ public void onClick(View v) { * Users that input the same channel name join the same channel. 
*/ private void joinChannel(String channelId) { - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); int channelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; @@ -364,14 +461,14 @@ private void joinChannel(String channelId) { option.autoSubscribeAudio = true; option.autoSubscribeVideo = true; - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see * https://docs.agora.io/en/cloud-recording/token_server_java?platform=Java*/ TokenUtils.gen(requireContext(), channelId, 0, ret -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. 
if you do not specify the uid, we will generate the uid for you*/ int res = engine.joinChannel(ret, channelId, 0, option); if (res != 0) { @@ -426,18 +523,15 @@ public void onJoinChannelSuccess(String channel, int uid, int elapsed) { showLongToast(String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); myUid = uid; joined = true; - handler.post(new Runnable() { - @Override - public void run() { - mute.setEnabled(true); - join.setEnabled(true); - join.setText(getString(R.string.leave)); - record.setEnabled(true); - playout.setEnabled(true); - inear.setEnabled(inEarSwitch.isChecked()); - inEarSwitch.setEnabled(true); - audioSeatManager.upLocalSeat(uid); - } + runOnUIThread(() -> { + mute.setEnabled(true); + join.setEnabled(true); + join.setText(getString(R.string.leave)); + record.setEnabled(true); + playout.setEnabled(true); + inear.setEnabled(inEarSwitch.isChecked()); + inEarSwitch.setEnabled(true); + audioSeatManager.upLocalSeat(uid); }); } @@ -544,22 +638,22 @@ public void onAudioRouteChanged(int routing) { showShortToast("onAudioRouteChanged : " + routing); runOnUIThread(() -> { String selectedRouteStr = getString(R.string.audio_route_speakerphone); - if(routing == Constants.AUDIO_ROUTE_EARPIECE){ + if (routing == Constants.AUDIO_ROUTE_EARPIECE) { selectedRouteStr = getString(R.string.audio_route_earpiece); - }else if(routing == Constants.AUDIO_ROUTE_SPEAKERPHONE){ + } else if (routing == Constants.AUDIO_ROUTE_SPEAKERPHONE) { selectedRouteStr = getString(R.string.audio_route_speakerphone); - }else if(routing == Constants.AUDIO_ROUTE_HEADSET){ + } else if (routing == Constants.AUDIO_ROUTE_HEADSET) { selectedRouteStr = getString(R.string.audio_route_headset); - }else if(routing == Constants.AUDIO_ROUTE_HEADSETBLUETOOTH){ + } else if (routing == Constants.AUDIO_ROUTE_BLUETOOTH_DEVICE_HFP) { selectedRouteStr = getString(R.string.audio_route_headset_bluetooth); - }else if(routing == Constants.AUDIO_ROUTE_USBDEVICE){ + } else if (routing == 
Constants.AUDIO_ROUTE_USBDEVICE) { selectedRouteStr = getString(R.string.audio_route_headset_typec); } int selection = 0; for (int i = 0; i < audioRouteInput.getAdapter().getCount(); i++) { String routeStr = (String) audioRouteInput.getItemAtPosition(i); - if(routeStr.equals(selectedRouteStr)){ + if (routeStr.equals(selectedRouteStr)) { selection = i; break; } @@ -568,4 +662,86 @@ public void onAudioRouteChanged(int routing) { }); } }; + + + /** + * The service will display a microphone foreground notification, + * which helps keep the recording alive when the activity is destroyed by the system under memory pressure or for other reasons. + * Note: The "android.permission.FOREGROUND_SERVICE" permission is required. + * And the android:foregroundServiceType should be microphone. + */ + public static class LocalRecordingService extends Service { + private static final int NOTIFICATION_ID = 1234567800; + private static final String CHANNEL_ID = "audio_channel_id"; + + + @Override + public void onCreate() { + super.onCreate(); + Notification notification = getDefaultNotification(); + + try { + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) { + this.startForeground(NOTIFICATION_ID, notification, ServiceInfo.FOREGROUND_SERVICE_TYPE_MICROPHONE); + } else { + this.startForeground(NOTIFICATION_ID, notification); + } + } catch (Exception ex) { + Log.e(TAG, "startForeground failed.", ex); + } + } + + @Nullable + @Override + public IBinder onBind(Intent intent) { + return null; + } + + private Notification getDefaultNotification() { + ApplicationInfo appInfo = this.getApplicationContext().getApplicationInfo(); + String name = this.getApplicationContext().getPackageManager().getApplicationLabel(appInfo).toString(); + int icon = appInfo.icon; + + try { + Bitmap iconBitMap = BitmapFactory.decodeResource(this.getApplicationContext().getResources(), icon); + if (iconBitMap == null || iconBitMap.getByteCount() == 0) { + Log.w(TAG, "Couldn't load icon from icon of applicationInfo, use android default"); + icon =
R.mipmap.ic_launcher; + } + } catch (Exception ex) { + Log.w(TAG, "Couldn't load icon from icon of applicationInfo, use android default"); + icon = R.mipmap.ic_launcher; + } + + if (Build.VERSION.SDK_INT >= 26) { + NotificationChannel mChannel = new NotificationChannel(CHANNEL_ID, name, NotificationManager.IMPORTANCE_DEFAULT); + NotificationManager mNotificationManager = (NotificationManager) this.getSystemService(Context.NOTIFICATION_SERVICE); + mNotificationManager.createNotificationChannel(mChannel); + } + + PendingIntent activityPendingIntent; + Intent intent = new Intent(); + intent.setClass(this, MainActivity.class); + if (Build.VERSION.SDK_INT >= 23) { + activityPendingIntent = PendingIntent.getActivity(this, 0, intent, PendingIntent.FLAG_ONE_SHOT | PendingIntent.FLAG_IMMUTABLE); + } else { + activityPendingIntent = PendingIntent.getActivity(this, 0, intent, PendingIntent.FLAG_ONE_SHOT); + } + + Notification.Builder builder = new Notification.Builder(this) + .addAction(icon, "Back to app", activityPendingIntent) + .setContentText("Agora Recording ...") + .setOngoing(true) + .setPriority(Notification.PRIORITY_HIGH) + .setSmallIcon(icon) + .setTicker(name) + .setWhen(System.currentTimeMillis()); + if (Build.VERSION.SDK_INT >= 26) { + builder.setChannelId(CHANNEL_ID); + } + + return builder.build(); + } + + } } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelVideo.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelVideo.java index 94d201179..f6e8e8ea2 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelVideo.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelVideo.java @@ -46,7 +46,9 @@ import io.agora.rtc2.video.VideoCanvas; import io.agora.rtc2.video.VideoEncoderConfiguration; -/**This demo demonstrates how to make a one-to-one video call*/ +/** + * This demo demonstrates how to make 
a one-to-one video call + */ @Example( index = 1, group = BASIC, @@ -54,8 +56,7 @@ actionId = R.id.action_mainFragment_to_joinChannelVideo, tipsId = R.string.joinchannelvideo ) -public class JoinChannelVideo extends BaseFragment implements View.OnClickListener -{ +public class JoinChannelVideo extends BaseFragment implements View.OnClickListener { private static final String TAG = JoinChannelVideo.class.getSimpleName(); private VideoReportLayout fl_local, fl_remote, fl_remote_2, fl_remote_3; @@ -69,15 +70,13 @@ public class JoinChannelVideo extends BaseFragment implements View.OnClickListen @Nullable @Override - public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) - { + public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) { View view = inflater.inflate(R.layout.fragment_joinchannel_video, container, false); return view; } @Override - public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) - { + public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceState) { super.onViewCreated(view, savedInstanceState); join = view.findViewById(R.id.btn_join); switch_camera = view.findViewById(R.id.btn_switch_camera); @@ -91,42 +90,39 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat } @Override - public void onActivityCreated(@Nullable Bundle savedInstanceState) - { + public void onActivityCreated(@Nullable Bundle savedInstanceState) { super.onActivityCreated(savedInstanceState); // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } - try - { + try { RtcEngineConfig config = new RtcEngineConfig(); - /** + /* * The context of Android Activity */ config.mContext = context.getApplicationContext(); - /** + /* * The App ID issued to you by Agora. 
See How to get the App ID */ config.mAppId = getString(R.string.agora_app_id); - /** Sets the channel profile of the Agora RtcEngine. + /* Sets the channel profile of the Agora RtcEngine. CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. Use this profile in one-on-one calls or group calls, where all users can talk freely. CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; an audience can only receive streams.*/ config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** + /* * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. */ config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); - config.mAreaCode = ((MainApplication)getActivity().getApplication()).getGlobalSettings().getAreaCode(); + config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** + /* * This parameter is for reporting the usages of APIExample to agora background. * Generally, it is not necessary for you to set this parameter. */ @@ -144,21 +140,17 @@ public void onActivityCreated(@Nullable Bundle savedInstanceState) // This api can only be used in the private media server scenario, otherwise some problems may occur. 
engine.setLocalAccessPoint(localAccessPointConfiguration); } - } - catch (Exception e) - { + } catch (Exception e) { e.printStackTrace(); getActivity().onBackPressed(); } } @Override - public void onDestroy() - { + public void onDestroy() { super.onDestroy(); - /**leaveChannel and Destroy the RtcEngine instance*/ - if(engine != null) - { + /*leaveChannel and Destroy the RtcEngine instance*/ + if (engine != null) { engine.leaveChannel(); } handler.post(RtcEngine::destroy); @@ -167,12 +159,9 @@ public void onDestroy() @SuppressLint("WrongConstant") @Override - public void onClick(View v) - { - if (v.getId() == R.id.btn_join) - { - if (!joined) - { + public void onClick(View v) { + if (v.getId() == R.id.btn_join) { + if (!joined) { CommonUtil.hideInputBoard(getActivity(), et_channel); // call when join button hit String channelId = et_channel.getText().toString(); @@ -182,31 +171,27 @@ public void onClick(View v) permissionList.add(Permission.WRITE_EXTERNAL_STORAGE); permissionList.add(Permission.RECORD_AUDIO); permissionList.add(Permission.CAMERA); - if(Build.VERSION.SDK_INT >= Build.VERSION_CODES.S){ + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.S) { permissionList.add(Manifest.permission.BLUETOOTH_CONNECT); } String[] permissionArray = new String[permissionList.size()]; permissionList.toArray(permissionArray); - if (AndPermission.hasPermissions(this,permissionArray)) - { + if (AndPermission.hasPermissions(this, permissionArray)) { joinChannel(channelId); return; } // Request permission AndPermission.with(this).runtime().permission( permissionArray - ).onGranted(permissions -> - { + ).onGranted(permissions -> { // Permissions Granted joinChannel(channelId); }).start(); - } - else - { + } else { joined = false; - /**After joining a channel, the user must call the leaveChannel method to end the + /*After joining a channel, the user must call the leaveChannel method to end the * call before joining another channel. 
This method returns 0 if the user leaves the * channel and releases all resources related to the call. This method call is * asynchronous, and the user has not exited the channel when the method call returns. @@ -230,26 +215,23 @@ public void onClick(View v) } remoteViews.clear(); } - }else if(v.getId() == switch_camera.getId()){ - if(engine != null && joined){ + } else if (v.getId() == switch_camera.getId()) { + if (engine != null && joined) { engine.switchCamera(); } } } - private void joinChannel(String channelId) - { + private void joinChannel(String channelId) { // Check if the context is valid Context context = getContext(); - if (context == null) - { + if (context == null) { return; } // Create render view by RtcEngine SurfaceView surfaceView = new SurfaceView(context); - if(fl_local.getChildCount() > 0) - { + if (fl_local.getChildCount() > 0) { fl_local.removeAllViews(); } // Add to the local container @@ -259,14 +241,14 @@ private void joinChannel(String channelId) // Set audio route to microPhone engine.setDefaultAudioRoutetoSpeakerphone(true); - /**In the demo, the default is to enter as the anchor.*/ + /*In the demo, the default is to enter as the anchor.*/ engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enable video module engine.enableVideo(); // Setup video encoding configs engine.setVideoEncoderConfiguration(new VideoEncoderConfiguration( - ((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), - VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication)getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), + ((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), + VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), STANDARD_BITRATE, VideoEncoderConfiguration.ORIENTATION_MODE.ORIENTATION_MODE_ADAPTIVE )); @@ -277,7 
+259,7 @@ private void joinChannel(String channelId) option.publishMicrophoneTrack = true; option.publishCameraTrack = true; - /**Please configure accessToken in the string_config file. + /*Please configure accessToken in the string_config file. * A temporary token generated in Console. A temporary token is valid for 24 hours. For details, see * https://docs.agora.io/en/Agora%20Platform/token?platform=All%20Platforms#get-a-temporary-token * A token generated at the server. This applies to scenarios with high-security requirements. For details, see @@ -285,11 +267,10 @@ private void joinChannel(String channelId) int uid = new Random().nextInt(1000) + 100000; TokenUtils.gen(requireContext(), channelId, uid, ret -> { - /** Allows a user to join a channel. + /* Allows a user to join a channel. if you do not specify the uid, we will generate the uid for you*/ int res = engine.joinChannel(ret, channelId, uid, option); - if (res != 0) - { + if (res != 0) { // Usually happens with invalid parameters // Error code description can be found at: // en: https://docs.agora.io/en/Voice/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_i_rtc_engine_event_handler_1_1_error_code.html @@ -307,12 +288,11 @@ private void joinChannel(String channelId) * IRtcEngineEventHandler is an abstract class providing default implementation. * The SDK uses this class to report to the app on SDK runtime events. 
*/ - private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() - { + private final IRtcEngineEventHandler iRtcEngineEventHandler = new IRtcEngineEventHandler() { /** * Error code description can be found at: - * en: https://api-ref.agora.io/en/video-sdk/android/4.x/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror - * cn: https://docs.agora.io/cn/video-call-4.x/API%20Reference/java_ng/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror + * en: {@see https://api-ref.agora.io/en/video-sdk/android/4.x/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror} + * cn: {@see https://docs.agora.io/cn/video-call-4.x/API%20Reference/java_ng/API/class_irtcengineeventhandler.html#callback_irtcengineeventhandler_onerror} */ @Override public void onError(int err) { @@ -324,7 +304,7 @@ public void onError(int err) { if (Constants.ERR_INVALID_TOKEN == err) { showAlert(getString(R.string.token_invalid)); - } if (Constants.ERR_TOKEN_EXPIRED == err) { + } else if (Constants.ERR_TOKEN_EXPIRED == err) { showAlert(getString(R.string.token_expired)); } } @@ -334,8 +314,7 @@ public void onError(int err) { * @param stats With this callback, the application retrieves the channel information, * such as the call duration and statistics.*/ @Override - public void onLeaveChannel(RtcStats stats) - { + public void onLeaveChannel(RtcStats stats) { super.onLeaveChannel(stats); Log.i(TAG, String.format("local user %d leaveChannel!", myUid)); showLongToast(String.format("local user %d leaveChannel!", myUid)); @@ -348,17 +327,14 @@ public void onLeaveChannel(RtcStats stats) * @param uid User ID * @param elapsed Time elapsed (ms) from the user calling joinChannel until this callback is triggered*/ @Override - public void onJoinChannelSuccess(String channel, int uid, int elapsed) - { + public void onJoinChannelSuccess(String channel, int uid, int elapsed) { Log.i(TAG, String.format("onJoinChannelSuccess channel %s uid %d", channel,
uid)); showLongToast(String.format("onJoinChannelSuccess channel %s uid %d", channel, uid)); myUid = uid; joined = true; - handler.post(new Runnable() - { + handler.post(new Runnable() { @Override - public void run() - { + public void run() { join.setEnabled(true); join.setText(getString(R.string.leave)); fl_local.setReportUid(uid); @@ -442,8 +418,7 @@ public void onRemoteAudioStateChanged(int uid, int state, int reason, int elapse * @param elapsed Time elapsed (ms) from the local user calling the joinChannel method until * the SDK triggers this callback.*/ @Override - public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) - { + public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapsed) { super.onRemoteVideoStateChanged(uid, state, reason, elapsed); Log.i(TAG, "onRemoteVideoStateChanged->" + uid + ", state->" + state + ", reason->" + reason); } @@ -453,28 +428,26 @@ public void onRemoteVideoStateChanged(int uid, int state, int reason, int elapse * @param elapsed Time delay (ms) from the local user calling joinChannel/setClientRole * until this callback is triggered.*/ @Override - public void onUserJoined(int uid, int elapsed) - { + public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + /*Check if the context is correct*/ Context context = getContext(); if (context == null) { return; } - if(remoteViews.containsKey(uid)){ - return; - } - else{ - handler.post(() -> - { - /**Display remote video stream*/ + if (!remoteViews.containsKey(uid)) { + handler.post(() -> { + /*Display remote video stream*/ SurfaceView surfaceView = null; // Create render view by RtcEngine surfaceView = new SurfaceView(context); surfaceView.setZOrderMediaOverlay(true); VideoReportLayout view = getAvailableView(); + if (view == null) { + return; + } view.setReportUid(uid); 
remoteViews.put(uid, view); // Add to the remote container @@ -496,19 +469,21 @@ public void onUserJoined(int uid, int elapsed) * USER_OFFLINE_BECOME_AUDIENCE(2): (Live broadcast only.) The client role switched from * the host to the audience.*/ @Override - public void onUserOffline(int uid, int reason) - { + public void onUserOffline(int uid, int reason) { Log.i(TAG, String.format("user %d offline! reason:%d", uid, reason)); showLongToast(String.format("user %d offline! reason:%d", uid, reason)); handler.post(new Runnable() { @Override public void run() { - /**Clear render view + /*Clear render view Note: The video will stay at its last frame, to completely remove it you will need to remove the SurfaceView from its parent*/ - engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); - remoteViews.get(uid).removeAllViews(); - remoteViews.remove(uid); + ViewGroup view = remoteViews.get(uid); + if (view != null) { + view.removeAllViews(); + remoteViews.remove(uid); + engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); + } } }); } @@ -543,17 +518,14 @@ public void onRemoteVideoStats(RemoteVideoStats stats) { }; private VideoReportLayout getAvailableView() { - if(fl_remote.getChildCount() == 0){ + if (fl_remote.getChildCount() == 0) { return fl_remote; - } - else if(fl_remote_2.getChildCount() == 0){ + } else if (fl_remote_2.getChildCount() == 0) { return fl_remote_2; - } - else if(fl_remote_3.getChildCount() == 0){ + } else if (fl_remote_3.getChildCount() == 0) { return fl_remote_3; - } - else{ - return fl_remote; + } else { + return null; } } } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelVideoByToken.java b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelVideoByToken.java index 6622f42d0..9d8507b1e 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelVideoByToken.java +++ 
b/Android/APIExample/app/src/main/java/io/agora/api/example/examples/basic/JoinChannelVideoByToken.java @@ -41,13 +41,7 @@ /** * This demo demonstrates how to make a one-to-one video call */ -@Example( - index = 0, - group = BASIC, - name = R.string.item_joinvideo_by_token, - actionId = R.id.action_mainFragment_to_joinChannelVideoByToken, - tipsId = R.string.joinchannelvideoByToken -) +@Example(index = 0, group = BASIC, name = R.string.item_joinvideo_by_token, actionId = R.id.action_mainFragment_to_joinChannelVideoByToken, tipsId = R.string.joinchannelvideoByToken) public class JoinChannelVideoByToken extends BaseFragment implements View.OnClickListener { private static final String TAG = JoinChannelVideoByToken.class.getSimpleName(); @@ -87,40 +81,32 @@ public void onViewCreated(@NonNull View view, @Nullable Bundle savedInstanceStat private boolean createRtcEngine(String appId) { try { RtcEngineConfig config = new RtcEngineConfig(); - /** - * The context of Android Activity - */ + + // The context of Android Activity config.mContext = requireContext().getApplicationContext(); - /** - * The App ID issued to you by Agora. See How to get the App ID - */ + // The App ID issued to you by Agora. + // See How to get the App ID config.mAppId = appId; - /** Sets the channel profile of the Agora RtcEngine. - CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. - Use this profile in one-on-one calls or group calls, where all users can talk freely. - CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast - channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; - an audience can only receive streams.*/ + // Sets the channel profile of the Agora RtcEngine. + // CHANNEL_PROFILE_COMMUNICATION(0): (Default) The Communication profile. + // Use this profile in one-on-one calls or group calls, where all users can talk freely. 
+ // CHANNEL_PROFILE_LIVE_BROADCASTING(1): The Live-Broadcast profile. Users in a live-broadcast + // channel have a role as either broadcaster or audience. A broadcaster can both send and receive streams; + // an audience can only receive streams. config.mChannelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING; - /** - * IRtcEngineEventHandler is an abstract class providing default implementation. - * The SDK uses this class to report to the app on SDK runtime events. - */ + // IRtcEngineEventHandler is an abstract class providing default implementation. + // The SDK uses this class to report to the app on SDK runtime events. config.mEventHandler = iRtcEngineEventHandler; config.mAudioScenario = Constants.AudioScenario.getValue(Constants.AudioScenario.DEFAULT); config.mAreaCode = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getAreaCode(); engine = RtcEngine.create(config); - /** - * This parameter is for reporting the usages of APIExample to agora background. - * Generally, it is not necessary for you to set this parameter. - */ + // This parameter is for reporting the usages of APIExample to agora background. + // Generally, it is not necessary for you to set this parameter.
engine.setParameters("{" + "\"rtc.report_app_scenario\":" - + "{" - + "\"appScenario\":" + 100 + "," + + "{" + "\"appScenario\":" + 100 + "," + "\"serviceType\":" + 11 + "," - + "\"appVersion\":\"" + RtcEngine.getSdkVersion() + "\"" - + "}" + + "\"appVersion\":\"" + RtcEngine.getSdkVersion() + "\"" + "}" + "}"); /* setting the local access point if the private cloud ip was set, otherwise the config will be invalid.*/ LocalAccessPointConfiguration localAccessPointConfiguration = ((MainApplication) getActivity().getApplication()).getGlobalSettings().getPrivateCloudConfig(); @@ -135,9 +121,9 @@ private boolean createRtcEngine(String appId) { return false; } - private void destroyRtcEngine(){ + private void destroyRtcEngine() { if (engine != null) { - /**leaveChannel and Destroy the RtcEngine instance*/ + // leaveChannel and Destroy the RtcEngine instance engine.leaveChannel(); RtcEngine.destroy(); engine = null; @@ -161,7 +147,7 @@ public void onClick(View v) { String channelId = et_channel.getText().toString(); String token = et_token.getText().toString(); - if(TextUtils.isEmpty(appId)){ + if (TextUtils.isEmpty(appId)) { showLongToast(getString(R.string.app_id_empty)); return; } @@ -205,7 +191,7 @@ private void joinChannel(String channelId, String token) { // Set audio route to microPhone engine.setDefaultAudioRoutetoSpeakerphone(true); - /**In the demo, the default is to enter as the anchor.*/ + // In the demo, the default is to enter as the anchor. 
engine.setClientRole(Constants.CLIENT_ROLE_BROADCASTER); // Enable video module engine.enableVideo(); @@ -214,8 +200,7 @@ private void joinChannel(String channelId, String token) { ((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingDimensionObject(), VideoEncoderConfiguration.FRAME_RATE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingFrameRate()), STANDARD_BITRATE, - VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()) - )); + VideoEncoderConfiguration.ORIENTATION_MODE.valueOf(((MainApplication) getActivity().getApplication()).getGlobalSettings().getVideoEncodingOrientation()))); ChannelMediaOptions option = new ChannelMediaOptions(); option.autoSubscribeAudio = true; @@ -223,8 +208,8 @@ private void joinChannel(String channelId, String token) { option.publishMicrophoneTrack = true; option.publishCameraTrack = true; - /** Allows a user to join a channel. - if you do not specify the uid, we will generate the uid for you*/ + // Allows a user to join a channel.
+ // if you do not specify the uid, we will generate the uid for you int res = engine.joinChannel(token, channelId, 0, option); if (res != 0) { // Usually happens with invalid parameters @@ -260,7 +245,8 @@ public void onError(int err) { if (Constants.ERR_INVALID_TOKEN == err) { showAlert(getString(R.string.token_invalid)); - } if (Constants.ERR_TOKEN_EXPIRED == err) { + } + if (Constants.ERR_TOKEN_EXPIRED == err) { showAlert(getString(R.string.token_expired)); } } @@ -388,22 +374,22 @@ public void onUserJoined(int uid, int elapsed) { super.onUserJoined(uid, elapsed); Log.i(TAG, "onUserJoined->" + uid); showLongToast(String.format("user %d joined!", uid)); - /**Check if the context is correct*/ + // Check if the context is correct Context context = getContext(); if (context == null) { return; } - if (remoteViews.containsKey(uid)) { - return; - } else { - handler.post(() -> - { - /**Display remote video stream*/ + if (!remoteViews.containsKey(uid)) { + handler.post(() -> { + // Display remote video stream SurfaceView surfaceView = null; // Create render view by RtcEngine surfaceView = new SurfaceView(context); surfaceView.setZOrderMediaOverlay(true); VideoReportLayout view = getAvailableView(); + if (view == null) { + return; + } view.setReportUid(uid); remoteViews.put(uid, view); // Add to the remote container @@ -431,12 +417,15 @@ public void onUserOffline(int uid, int reason) { handler.post(new Runnable() { @Override public void run() { - /**Clear render view - Note: The video will stay at its last frame, to completely remove it you will need to - remove the SurfaceView from its parent*/ - engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); - remoteViews.get(uid).removeAllViews(); - remoteViews.remove(uid); + // Clear render view + // Note: The video will stay at its last frame, to completely remove it you will need to + // remove the SurfaceView from its parent + ViewGroup view = remoteViews.get(uid); + if (view != null) { +
view.removeAllViews(); + remoteViews.remove(uid); + engine.setupRemoteVideo(new VideoCanvas(null, RENDER_MODE_HIDDEN, uid)); + } } }); } @@ -478,7 +467,7 @@ private VideoReportLayout getAvailableView() { } else if (fl_remote_3.getChildCount() == 0) { return fl_remote_3; } else { - return fl_remote; + return null; } } } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/AudioFileReader.java b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/AudioFileReader.java index 387463604..791b72987 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/AudioFileReader.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/AudioFileReader.java @@ -6,14 +6,35 @@ import java.io.IOException; import java.io.InputStream; +/** + * The type Audio file reader. + */ public class AudioFileReader { private static final String AUDIO_FILE = "output.raw"; + /** + * The constant SAMPLE_RATE. + */ public static final int SAMPLE_RATE = 44100; + /** + * The constant SAMPLE_NUM_OF_CHANNEL. + */ public static final int SAMPLE_NUM_OF_CHANNEL = 2; + /** + * The constant BITS_PER_SAMPLE. + */ public static final int BITS_PER_SAMPLE = 16; + /** + * The constant BYTE_PER_SAMPLE. + */ public static final float BYTE_PER_SAMPLE = 1.0f * BITS_PER_SAMPLE / 8 * SAMPLE_NUM_OF_CHANNEL; + /** + * The constant DURATION_PER_SAMPLE. + */ public static final float DURATION_PER_SAMPLE = 1000.0f / SAMPLE_RATE; // ms + /** + * The constant SAMPLE_COUNT_PER_MS. + */ public static final float SAMPLE_COUNT_PER_MS = SAMPLE_RATE * 1.0f / 1000; // ms private static final int BUFFER_SAMPLE_COUNT = (int) (SAMPLE_COUNT_PER_MS * 10); // 10ms sample count @@ -26,21 +47,33 @@ public class AudioFileReader { private InnerThread thread; private InputStream inputStream; - public AudioFileReader(Context context, OnAudioReadListener listener){ + /** + * Instantiates a new Audio file reader. 
+ * + * @param context the context + * @param listener the listener + */ + public AudioFileReader(Context context, OnAudioReadListener listener) { this.context = context; this.audioReadListener = listener; } + /** + * Start. + */ public void start() { - if(thread == null){ + if (thread == null) { thread = new InnerThread(); thread.start(); } } - public void stop(){ + /** + * Stop. + */ + public void stop() { pushing = false; - if(thread != null){ + if (thread != null) { try { thread.join(); } catch (InterruptedException e) { @@ -51,11 +84,20 @@ public void stop(){ } } + /** + * The interface On audio read listener. + */ public interface OnAudioReadListener { + /** + * On audio read. + * + * @param buffer the buffer + * @param timestamp the timestamp + */ void onAudioRead(byte[] buffer, long timestamp); } - private class InnerThread extends Thread{ + private final class InnerThread extends Thread { @Override public void run() { @@ -68,17 +110,17 @@ public void run() { Process.setThreadPriority(Process.THREAD_PRIORITY_URGENT_AUDIO); pushing = true; - long start_time = System.currentTimeMillis();; + long start_time = System.currentTimeMillis(); int sent_audio_frames = 0; while (pushing) { - if(audioReadListener != null){ + if (audioReadListener != null) { audioReadListener.onAudioRead(readBuffer(), System.currentTimeMillis()); } - ++ sent_audio_frames; + ++sent_audio_frames; long next_frame_start_time = sent_audio_frames * BUFFER_DURATION + start_time; long now = System.currentTimeMillis(); - if(next_frame_start_time > now){ + if (next_frame_start_time > now) { long sleep_duration = next_frame_start_time - now; try { Thread.sleep(sleep_duration); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/ClassUtils.java b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/ClassUtils.java index 0c281272d..58ac95e28 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/ClassUtils.java +++ 
b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/ClassUtils.java @@ -23,8 +23,14 @@ import dalvik.system.DexFile; import io.agora.api.example.BuildConfig; -public class ClassUtils -{ +/** + * The type Class utils. + */ +public final class ClassUtils { + + private ClassUtils() { + } + private static final String TAG = ClassUtils.class.getSimpleName(); private static final String EXTRACTED_NAME_EXT = ".classes"; private static final String EXTRACTED_SUFFIX = ".zip"; @@ -37,70 +43,54 @@ public class ClassUtils private static final int VM_WITH_MULTIDEX_VERSION_MAJOR = 2; private static final int VM_WITH_MULTIDEX_VERSION_MINOR = 1; - private static SharedPreferences getMultiDexPreferences(Context context) - { + private static SharedPreferences getMultiDexPreferences(Context context) { return context.getSharedPreferences(PREFS_FILE, Build.VERSION.SDK_INT < Build.VERSION_CODES.HONEYCOMB ? Context.MODE_PRIVATE : Context.MODE_PRIVATE | Context.MODE_MULTI_PROCESS); } /** * By specifying the package name, scan all ClassName contained under the package * - * @param context - * @param packageName + * @param context the context + * @param packageName the package name * @return Collection of all classes + * @throws NameNotFoundException the name not found exception + * @throws IOException the io exception + * @throws InterruptedException the interrupted exception */ - public static Set getFileNameByPackageName(Context context, final String packageName) throws PackageManager.NameNotFoundException, IOException, InterruptedException - { + public static Set getFileNameByPackageName(Context context, final String packageName) throws PackageManager.NameNotFoundException, IOException, InterruptedException { final Set classNames = new HashSet<>(); List paths = getSourcePaths(context); final CountDownLatch parserCtl = new CountDownLatch(paths.size()); - for (final String path : paths) - { - DefaultPoolExecutor.getInstance().execute(new Runnable() - { + for (final String 
path : paths) { + DefaultPoolExecutor.getInstance().execute(new Runnable() { @Override - public void run() - { + public void run() { DexFile dexfile = null; - try - { - if (path.endsWith(EXTRACTED_SUFFIX)) - { + try { + if (path.endsWith(EXTRACTED_SUFFIX)) { //NOT use new DexFile(path), because it will throw "permission error in /data/dalvik-cache" dexfile = DexFile.loadDex(path, path + ".tmp", 0); - } - else - { + } else { dexfile = new DexFile(path); } Enumeration dexEntries = dexfile.entries(); - while (dexEntries.hasMoreElements()) - { + while (dexEntries.hasMoreElements()) { String className = dexEntries.nextElement(); - if (className.startsWith(packageName)) - { + if (className.startsWith(packageName)) { classNames.add(className); } } - } - catch (Throwable ignore) - { + } catch (Throwable ignore) { Log.e("ARouter", "Scan map file in dex files made error.", ignore); - } - finally - { - if (null != dexfile) - { - try - { + } finally { + if (null != dexfile) { + try { dexfile.close(); - } - catch (Throwable ignore) - { + } catch (Throwable ignore) { } } @@ -121,11 +111,10 @@ public void run() * * @param context the application context * @return all the dex path - * @throws PackageManager.NameNotFoundException - * @throws IOException + * @throws NameNotFoundException the name not found exception + * @throws IOException the io exception */ - public static List getSourcePaths(Context context) throws PackageManager.NameNotFoundException, IOException - { + public static List getSourcePaths(Context context) throws PackageManager.NameNotFoundException, IOException { ApplicationInfo applicationInfo = context.getPackageManager().getApplicationInfo(context.getPackageName(), 0); File sourceApk = new File(applicationInfo.sourceDir); @@ -135,33 +124,27 @@ public static List getSourcePaths(Context context) throws PackageManager //the prefix of extracted file, ie: test.classes String extractedFilePrefix = sourceApk.getName() + EXTRACTED_NAME_EXT; - /** If MultiDex already 
supported by VM, we will not to load Classesx.zip from - * Secondary Folder, because there is none.*/ - if (!isVMMultidexCapable()) - { + //If MultiDex is already supported by the VM, we will not load Classesx.zip from + //the Secondary Folder, because there is none. + if (!isVMMultidexCapable()) { //the total dex numbers int totalDexNumber = getMultiDexPreferences(context).getInt(KEY_DEX_NUMBER, 1); File dexDir = new File(applicationInfo.dataDir, SECONDARY_FOLDER_NAME); - for (int secondaryNumber = 2; secondaryNumber <= totalDexNumber; secondaryNumber++) - { + for (int secondaryNumber = 2; secondaryNumber <= totalDexNumber; secondaryNumber++) { //for each dex file, ie: test.classes2.zip, test.classes3.zip... String fileName = extractedFilePrefix + secondaryNumber + EXTRACTED_SUFFIX; File extractedFile = new File(dexDir, fileName); - if (extractedFile.isFile()) - { + if (extractedFile.isFile()) { sourcePaths.add(extractedFile.getAbsolutePath()); //we ignore the verify zip part - } - else - { + } else { throw new IOException("Missing extracted secondary dex file '" + extractedFile.getPath() + "'"); } } } - if (BuildConfig.DEBUG) - { // Search instant run support only debuggable + if (BuildConfig.DEBUG) { // Search instant run support only when debuggable sourcePaths.addAll(tryLoadInstantRunDexFile(applicationInfo)); } return sourcePaths; @@ -169,43 +152,36 @@ public static List getSourcePaths(Context context) throws PackageManager /** * Get instant run dex path, used to catch the branch usingApkSplits=false. + * + * @param applicationInfo Application info.
+ * @return instantRunSourcePaths */ - private static List tryLoadInstantRunDexFile(ApplicationInfo applicationInfo) - { + private static List tryLoadInstantRunDexFile(ApplicationInfo applicationInfo) { List instantRunSourcePaths = new ArrayList<>(); - if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP && null != applicationInfo.splitSourceDirs) - { + if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP && null != applicationInfo.splitSourceDirs) { // add the split apk, normally for InstantRun, and newest version. instantRunSourcePaths.addAll(Arrays.asList(applicationInfo.splitSourceDirs)); Log.d(TAG, "Found InstantRun support"); - } - else - { - try - { + } else { + try { // This class is reflected from the Google Instant Run SDK; it tells us where the dex files are located. Class pathsByInstantRun = Class.forName("com.android.tools.fd.runtime.Paths"); Method getDexFileDirectory = pathsByInstantRun.getMethod("getDexFileDirectory", String.class); String instantRunDexPath = (String) getDexFileDirectory.invoke(null, applicationInfo.packageName); File instantRunFilePath = new File(instantRunDexPath); - if (instantRunFilePath.exists() && instantRunFilePath.isDirectory()) - { + if (instantRunFilePath.exists() && instantRunFilePath.isDirectory()) { File[] dexFile = instantRunFilePath.listFiles(); - for (File file : dexFile) - { - if (null != file && file.exists() && file.isFile() && file.getName().endsWith(".dex")) - { + for (File file : dexFile) { + if (null != file && file.exists() && file.isFile() && file.getName().endsWith(".dex")) { instantRunSourcePaths.add(file.getAbsolutePath()); } } Log.d(TAG, "Found InstantRun support"); } - } - catch (Exception e) - { + } catch (Exception e) { Log.e(TAG, "InstantRun support error, " + e.getMessage()); } } @@ -219,45 +195,33 @@ private static List tryLoadInstantRunDexFile(ApplicationInfo application * * @return true if the VM handles multidex */ - private static boolean isVMMultidexCapable() - { + private static boolean
isVMMultidexCapable() { boolean isMultidexCapable = false; String vmName = null; - try - { - if (isYunOS()) - { // YunOS need special judgment + try { + if (isYunOS()) { // YunOS need special judgment vmName = "'YunOS'"; isMultidexCapable = Integer.valueOf(System.getProperty("ro.build.version.sdk")) >= 21; - } - else - { // Native Android system + } else { // Native Android system vmName = "'Android'"; String versionString = System.getProperty("java.vm.version"); - if (versionString != null) - { + if (versionString != null) { Matcher matcher = Pattern.compile("(\\d+)\\.(\\d+)(\\.\\d+)?").matcher(versionString); - if (matcher.matches()) - { - try - { + if (matcher.matches()) { + try { int major = Integer.parseInt(matcher.group(1)); int minor = Integer.parseInt(matcher.group(2)); - isMultidexCapable = (major > VM_WITH_MULTIDEX_VERSION_MAJOR) - || ((major == VM_WITH_MULTIDEX_VERSION_MAJOR) - && (minor >= VM_WITH_MULTIDEX_VERSION_MINOR)); - } - catch (NumberFormatException ignore) - { + isMultidexCapable = major > VM_WITH_MULTIDEX_VERSION_MAJOR + || major == VM_WITH_MULTIDEX_VERSION_MAJOR + && minor >= VM_WITH_MULTIDEX_VERSION_MINOR; + } catch (NumberFormatException ignore) { // let isMultidexCapable be false } } } } - } - catch (Exception ignore) - { + } catch (Exception ignore) { } @@ -267,18 +231,16 @@ private static boolean isVMMultidexCapable() /** * Determine whether the system is a YunOS system + * + * @return true or false. 
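The multidex check that this refactor reflows parses `java.vm.version` and treats ART (version 2.1 or newer) as natively multidex-capable. The comparison logic can be exercised on its own; the sketch below reuses the same regex and threshold constants (the class name VmVersionCheck is hypothetical, not part of the patch):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Standalone sketch of the ART multidex capability check above: a VM reporting
// java.vm.version >= 2.1 handles multidex natively. Illustrative only.
public class VmVersionCheck {
    static final int VM_WITH_MULTIDEX_VERSION_MAJOR = 2;
    static final int VM_WITH_MULTIDEX_VERSION_MINOR = 1;

    static boolean isMultidexCapable(String vmVersion) {
        Matcher matcher = Pattern.compile("(\\d+)\\.(\\d+)(\\.\\d+)?").matcher(vmVersion);
        if (!matcher.matches()) {
            return false;
        }
        int major = Integer.parseInt(matcher.group(1));
        int minor = Integer.parseInt(matcher.group(2));
        // Same predicate as the patch, just without the parenthesis noise.
        return major > VM_WITH_MULTIDEX_VERSION_MAJOR
                || (major == VM_WITH_MULTIDEX_VERSION_MAJOR
                && minor >= VM_WITH_MULTIDEX_VERSION_MINOR);
    }

    public static void main(String[] args) {
        System.out.println(isMultidexCapable("2.1.0")); // ART: true
        System.out.println(isMultidexCapable("1.6.0")); // Dalvik: false
    }
}
```

Note the checkstyle change only strips redundant parentheses; operator precedence (`&&` binding tighter than `||`) keeps the predicate identical.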
*/ - private static boolean isYunOS() - { - try - { + private static boolean isYunOS() { + try { String version = System.getProperty("ro.yunos.version"); String vmName = System.getProperty("java.vm.name"); - return (vmName != null && vmName.toLowerCase().contains("lemur")) - || (version != null && version.trim().length() > 0); - } - catch (Exception ignore) - { + return vmName != null && vmName.toLowerCase().contains("lemur") + || version != null && version.trim().length() > 0; + } catch (Exception ignore) { return false; } } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/CommonUtil.java b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/CommonUtil.java index b78399dc2..8fd287110 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/CommonUtil.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/CommonUtil.java @@ -6,12 +6,23 @@ import android.widget.EditText; /** + * The type Common util. + * * @author cjw */ -public class CommonUtil { +public final class CommonUtil { - public static void hideInputBoard(Activity activity, EditText editText) - { + private CommonUtil() { + + } + + /** + * Hide input board. 
+ * + * @param activity the activity + * @param editText the edit text + */ + public static void hideInputBoard(Activity activity, EditText editText) { InputMethodManager imm = (InputMethodManager) activity.getSystemService(Context.INPUT_METHOD_SERVICE); imm.hideSoftInputFromWindow(editText.getWindowToken(), 0); } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/DefaultPoolExecutor.java b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/DefaultPoolExecutor.java index 324fc3087..a5cd77878 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/DefaultPoolExecutor.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/DefaultPoolExecutor.java @@ -16,10 +16,9 @@ * Executors * * @version 1.0 - * @since 16/4/28 4:07 PM + * @since 16 /4/28 4:07 PM -public class DefaultPoolExecutor extends ThreadPoolExecutor -{ +public final class DefaultPoolExecutor extends ThreadPoolExecutor { private static final String TAG = DefaultPoolExecutor.class.getSimpleName(); // Thread args private static final int CPU_COUNT = Runtime.getRuntime().availableProcessors(); @@ -29,14 +28,15 @@ public class DefaultPoolExecutor extends ThreadPoolExecutor private static DefaultPoolExecutor instance; - public static DefaultPoolExecutor getInstance() - { - if (null == instance) - { - synchronized (DefaultPoolExecutor.class) - { - if (null == instance) - { + /** + * Gets instance.
+ * + * @return the instance + */ + public static DefaultPoolExecutor getInstance() { + if (null == instance) { + synchronized (DefaultPoolExecutor.class) { + if (null == instance) { instance = new DefaultPoolExecutor( INIT_THREAD_COUNT, MAX_THREAD_COUNT, @@ -50,13 +50,10 @@ public static DefaultPoolExecutor getInstance() return instance; } - private DefaultPoolExecutor(int corePoolSize, int maximumPoolSize, long keepAliveTime, TimeUnit unit, BlockingQueue workQueue, ThreadFactory threadFactory) - { - super(corePoolSize, maximumPoolSize, keepAliveTime, unit, workQueue, threadFactory, new RejectedExecutionHandler() - { + private DefaultPoolExecutor(int corePoolSize, int maximumPoolSize, long keepAliveTime, TimeUnit unit, BlockingQueue workQueue, ThreadFactory threadFactory) { + super(corePoolSize, maximumPoolSize, keepAliveTime, unit, workQueue, threadFactory, new RejectedExecutionHandler() { @Override - public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) - { + public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) { Log.e(TAG, "Task rejected, too many task!"); } }); @@ -67,30 +64,20 @@ public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) * @param t the exception that caused termination, or null if */ @Override - protected void afterExecute(Runnable r, Throwable t) - { + protected void afterExecute(Runnable r, Throwable t) { super.afterExecute(r, t); - if (t == null && r instanceof Future) - { - try - { + if (t == null && r instanceof Future) { + try { ((Future) r).get(); - } - catch (CancellationException ce) - { + } catch (CancellationException ce) { t = ce; - } - catch (ExecutionException ee) - { + } catch (ExecutionException ee) { t = ee.getCause(); - } - catch (InterruptedException ie) - { + } catch (InterruptedException ie) { Thread.currentThread().interrupt(); // ignore/reset } } - if (t != null) - { + if (t != null) { Log.w(TAG, "Running task appeared exception! 
Thread [" + Thread.currentThread().getName() + "], because [" + t.getMessage() + "]\n" + TextUtils.formatStackTrace(t.getStackTrace())); } } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/DefaultThreadFactory.java b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/DefaultThreadFactory.java index 7a6f99fca..407dd71c4 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/DefaultThreadFactory.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/DefaultThreadFactory.java @@ -12,37 +12,40 @@ * * @author zhilong Contact me. * @version 1.0 - * @since 15/12/25 10:51 AM + * @since 15 /12/25 10:51 AM -public class DefaultThreadFactory implements ThreadFactory -{ +public class DefaultThreadFactory implements ThreadFactory { private static final String TAG = DefaultThreadFactory.class.getSimpleName(); - private static final AtomicInteger poolNumber = new AtomicInteger(1); + private static final AtomicInteger POOL_NUMBER = new AtomicInteger(1); private final AtomicInteger threadNumber = new AtomicInteger(1); private final ThreadGroup group; private final String namePrefix; - public DefaultThreadFactory() - { + /** + * Instantiates a new Default thread factory. + */ + public DefaultThreadFactory() { SecurityManager s = System.getSecurityManager(); - group = (s != null) ? s.getThreadGroup() : - Thread.currentThread().getThreadGroup(); - namePrefix = "ARouter task pool No." + poolNumber.getAndIncrement() + ", thread No."; + group = (s != null) ? s.getThreadGroup() + : Thread.currentThread().getThreadGroup(); + namePrefix = "ARouter task pool No." + POOL_NUMBER.getAndIncrement() + ", thread No."; } - public Thread newThread(@NonNull Runnable runnable) - { + /** + * New thread.
+ * @param runnable a runnable to be executed by new thread instance + * @return Thread + */ + public Thread newThread(@NonNull Runnable runnable) { String threadName = namePrefix + threadNumber.getAndIncrement(); Log.i(TAG, "Thread production, name is [" + threadName + "]"); Thread thread = new Thread(group, runnable, threadName, 0); - if (thread.isDaemon()) - { //Make non-background thread + if (thread.isDaemon()) { //Make non-background thread thread.setDaemon(false); } - if (thread.getPriority() != Thread.NORM_PRIORITY) - { + if (thread.getPriority() != Thread.NORM_PRIORITY) { thread.setPriority(Thread.NORM_PRIORITY); } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/FileKtUtils.kt b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/FileKtUtils.kt index b0ff18b1a..8b7f20ce6 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/FileKtUtils.kt +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/FileKtUtils.kt @@ -34,9 +34,21 @@ import java.io.FileOutputStream import java.io.IOException import java.io.InputStreamReader +/** + * File kt utils + * + * @constructor Create empty File kt utils + */ object FileKtUtils { - val TAG = "FileUtils" + private const val TAG = "FileUtils" + /** + * Get assets string + * + * @param context + * @param path + * @return + */ fun getAssetsString(context: Context, path: String): String { val sb = StringBuilder() var isr: InputStreamReader? 
= null @@ -56,20 +68,27 @@ object FileKtUtils { try { isr.close() } catch (e: IOException) { - e.printStackTrace() + Log.e(TAG, "", e) } } if (br != null) { try { br.close() } catch (e: IOException) { - e.printStackTrace() + Log.e(TAG, "", e) } } } return sb.toString() } + /** + * Copy assets + * + * @param context + * @param assetsPath + * @param targetPath + */ fun copyAssets(context: Context, assetsPath: String, targetPath: String) { // Get all files and folders one level under the assets directory val fileNames = context.resources.assets.list(assetsPath) diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/FileUtils.java b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/FileUtils.java index f3c1210fa..f15f4c2ec 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/FileUtils.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/FileUtils.java @@ -10,10 +10,27 @@ import java.io.IOException; import java.io.InputStream; -public class FileUtils { +/** + * The type File utils. + */ +public final class FileUtils { + private FileUtils() { + + } + + /** + * The constant SEPARATOR. + */ public static final String SEPARATOR = File.separator; + /** + * Copy files from assets.
+ * + * @param context the context + * @param assetsPath the assets path + * @param storagePath the storage path + */ public static void copyFilesFromAssets(Context context, String assetsPath, String storagePath) { String temp = ""; @@ -32,29 +49,29 @@ public static void copyFilesFromAssets(Context context, String assetsPath, Strin AssetManager assetManager = context.getAssets(); try { File file = new File(storagePath); - if (!file.exists()) {//If the folder does not exist, create a new one + if (!file.exists()) { //If the folder does not exist, create a new one file.mkdirs(); } // Get all file and directory names under the assets directory String[] fileNames = assetManager.list(assetsPath); - if (fileNames.length > 0) {//If it is a directory, e.g. apk + if (fileNames.length > 0) { //If it is a directory, e.g. apk for (String fileName : fileNames) { if (!TextUtils.isEmpty(assetsPath)) { - temp = assetsPath + SEPARATOR + fileName;//Build the full assets resource path + temp = assetsPath + SEPARATOR + fileName; //Build the full assets resource path } String[] childFileNames = assetManager.list(temp); - if (!TextUtils.isEmpty(temp) && childFileNames.length > 0) {//Determine whether it is a file or a folder: if it is a folder + if (!TextUtils.isEmpty(temp) && childFileNames.length > 0) { //Determine whether it is a file or a folder: if it is a folder copyFilesFromAssets(context, temp, storagePath + SEPARATOR + fileName); - } else {//If it is a file + } else { //If it is a file InputStream inputStream = assetManager.open(temp); readInputStream(storagePath + SEPARATOR + fileName, inputStream); } } - } else {//If it is a file, e.g. doc_test.txt or apk/app_test.apk + } else { //If it is a file, e.g. doc_test.txt or apk/app_test.apk InputStream inputStream = assetManager.open(assetsPath); - if (assetsPath.contains(SEPARATOR)) {//apk/app_test.apk + if (assetsPath.contains(SEPARATOR)) { //apk/app_test.apk assetsPath = assetsPath.substring(assetsPath.lastIndexOf(SEPARATOR), assetsPath.length()); } readInputStream(storagePath + SEPARATOR + assetsPath, inputStream); @@ -81,11 +98,11 @@ public static void readInputStream(String storagePath, InputStream inputStream) byte[] buffer = new byte[inputStream.available()]; // 3. Start reading the file int lenght = 0; -
while ((lenght = inputStream.read(buffer)) != -1) {// Read bytes from the input stream into buffer in a loop + while ((lenght = inputStream.read(buffer)) != -1) { // Read bytes from the input stream into buffer in a loop // Write the data in buffer to the outputStream fos.write(buffer, 0, lenght); } - fos.flush();// Flush the buffer + fos.flush(); // Flush the buffer // 4. Close the streams fos.close(); inputStream.close(); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/TextUtils.java b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/TextUtils.java index 0f8034b3a..0e42d87ce 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/TextUtils.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/TextUtils.java @@ -1,8 +1,18 @@ package io.agora.api.example.utils; -public class TextUtils { +/** + * The type Text utils. + */ +public final class TextUtils { + + private TextUtils() { + + } /** - * Print thread stack + * Format stack trace string. + * + * @param stackTrace the stack trace + * @return the string */ public static String formatStackTrace(StackTraceElement[] stackTrace) { StringBuilder sb = new StringBuilder(); diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/TokenUtils.java b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/TokenUtils.java index b8cbea3cd..b33293ac0 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/TokenUtils.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/TokenUtils.java @@ -24,21 +24,36 @@ import okhttp3.ResponseBody; import okhttp3.logging.HttpLoggingInterceptor; -public class TokenUtils { +/** + * The type Token utils.
+ */ +public final class TokenUtils { private static final String TAG = "TokenGenerator"; - private final static OkHttpClient client; + private final static OkHttpClient CLIENT; + + private TokenUtils() { + + } static { HttpLoggingInterceptor interceptor = new HttpLoggingInterceptor(); interceptor.setLevel(HttpLoggingInterceptor.Level.BODY); - client = new OkHttpClient.Builder() + CLIENT = new OkHttpClient.Builder() .addInterceptor(interceptor) .build(); } - public static void gen(Context context, String channelName, int uid, OnTokenGenCallback onGetToken){ + /** + * Gen. + * + * @param context the context + * @param channelName the channel name + * @param uid the uid + * @param onGetToken the on get token + */ + public static void gen(Context context, String channelName, int uid, OnTokenGenCallback onGetToken) { gen(context.getString(R.string.agora_app_id), context.getString(R.string.agora_app_certificate), channelName, uid, ret -> { - if(onGetToken != null){ + if (onGetToken != null) { runOnUiThread(() -> { onGetToken.onTokenGen(ret); }); @@ -53,17 +68,17 @@ public static void gen(Context context, String channelName, int uid, OnTokenGen }); } - private static void runOnUiThread(@NonNull Runnable runnable){ - if(Thread.currentThread() == Looper.getMainLooper().getThread()){ + private static void runOnUiThread(@NonNull Runnable runnable) { + if (Thread.currentThread() == Looper.getMainLooper().getThread()) { runnable.run(); - }else{ + } else { new Handler(Looper.getMainLooper()).post(runnable); } } - private static void gen(String appId, String certificate, String channelName, int uid, OnTokenGenCallback onGetToken, OnTokenGenCallback onError) { - if(TextUtils.isEmpty(appId) || TextUtils.isEmpty(certificate) || TextUtils.isEmpty(channelName)){ - if(onError != null){ + private static void gen(String appId, String certificate, String channelName, int uid, OnTokenGenCallback onGetToken, OnTokenGenCallback onError) { + if (TextUtils.isEmpty(appId) || 
TextUtils.isEmpty(certificate) || TextUtils.isEmpty(channelName)) { + if (onError != null) { onError.onTokenGen(new IllegalArgumentException("appId=" + appId + ", certificate=" + certificate + ", channelName=" + channelName)); } return; @@ -73,13 +88,13 @@ private static void gen(String appId, String certificate, String channelName, in postBody.put("appId", appId); postBody.put("appCertificate", certificate); postBody.put("channelName", channelName); - postBody.put("expire", 900);// s + postBody.put("expire", 900); // s postBody.put("src", "Android"); postBody.put("ts", System.currentTimeMillis() + ""); postBody.put("type", 1); // 1: RTC Token ; 2: RTM Token postBody.put("uid", uid + ""); } catch (JSONException e) { - if(onError != null){ + if (onError != null) { onError.onTokenGen(e); } } @@ -89,10 +104,10 @@ private static void gen(String appId, String certificate, String channelName, in .addHeader("Content-Type", "application/json") .post(RequestBody.create(postBody.toString(), null)) .build(); - client.newCall(request).enqueue(new Callback() { + CLIENT.newCall(request).enqueue(new Callback() { @Override public void onFailure(@NonNull Call call, @NonNull IOException e) { - if(onError != null){ + if (onError != null) { onError.onTokenGen(e); } } @@ -105,11 +120,11 @@ public void onResponse(@NonNull Call call, @NonNull Response response) throws IO JSONObject jsonObject = new JSONObject(body.string()); JSONObject data = jsonObject.optJSONObject("data"); String token = Objects.requireNonNull(data).optString("token"); - if(onGetToken != null){ + if (onGetToken != null) { onGetToken.onTokenGen(token); } } catch (Exception e) { - if(onError != null){ + if (onError != null) { onError.onTokenGen(e); } } @@ -118,7 +133,17 @@ public void onResponse(@NonNull Call call, @NonNull Response response) throws IO }); } + /** + * The interface On token gen callback. + * + * @param the type parameter + */ public interface OnTokenGenCallback { + /** + * On token gen. 
+ * + * @param token the token + */ void onTokenGen(T token); } diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/VideoFileReader.java b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/VideoFileReader.java index 04901702d..e85b460e3 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/VideoFileReader.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/VideoFileReader.java @@ -5,6 +5,9 @@ import java.io.IOException; import java.io.InputStream; +/** + * The type Video file reader. + */ public class VideoFileReader { private final String RAW_VIDEO_PATH = "sample.yuv"; private final int RAW_VIDEO_WIDTH = 320; @@ -21,47 +24,81 @@ public class VideoFileReader { private InnerThread thread; private final int trackId; - public VideoFileReader(Context context, OnVideoReadListener listener){ + /** + * Instantiates a new Video file reader. + * + * @param context the context + * @param listener the listener + */ + public VideoFileReader(Context context, OnVideoReadListener listener) { this(context, 0, listener); } - public VideoFileReader(Context context, int trackId, OnVideoReadListener listener){ + /** + * Instantiates a new Video file reader. + * + * @param context the context + * @param trackId the track id + * @param listener the listener + */ + public VideoFileReader(Context context, int trackId, OnVideoReadListener listener) { this.trackId = trackId; this.context = context.getApplicationContext(); this.videoReadListener = listener; } + /** + * Gets track id. + * + * @return the track id + */ public int getTrackId() { return trackId; } - public final void start(){ - if(thread != null){ + /** + * Start. + */ + public final void start() { + if (thread != null) { return; } thread = new InnerThread(); thread.start(); } - public final void stop(){ - if(thread != null){ + /** + * Stop. 
+ */ + public final void stop() { + if (thread != null) { pushing = false; try { thread.join(); } catch (InterruptedException e) { e.printStackTrace(); - }finally { + } finally { thread = null; } } } + /** + * The interface On video read listener. + */ public interface OnVideoReadListener { + /** + * On video read. + * + * @param buffer the buffer + * @param width the width + * @param height the height + */ void onVideoRead(byte[] buffer, int width, int height); } - private class InnerThread extends Thread { + private final class InnerThread extends Thread { @Override public void run() { super.run(); @@ -84,7 +121,7 @@ public void run() { } catch (IOException e) { e.printStackTrace(); } - if(videoReadListener != null){ + if (videoReadListener != null) { videoReadListener.onVideoRead(buffer, RAW_VIDEO_WIDTH, RAW_VIDEO_HEIGHT); } long consume = System.nanoTime() - start; diff --git a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/YUVUtils.java b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/YUVUtils.java index b3f422350..131fc9656 100644 --- a/Android/APIExample/app/src/main/java/io/agora/api/example/utils/YUVUtils.java +++ b/Android/APIExample/app/src/main/java/io/agora/api/example/utils/YUVUtils.java @@ -12,82 +12,114 @@ import android.graphics.Rect; import android.graphics.YuvImage; import android.renderscript.Allocation; -import android.renderscript.Type.Builder; import android.renderscript.RenderScript; import android.renderscript.ScriptIntrinsicBlur; import android.renderscript.ScriptIntrinsicYuvToRGB; +import android.renderscript.Type.Builder; import java.io.ByteArrayOutputStream; import java.io.IOException; import java.nio.ByteBuffer; -public class YUVUtils { +/** + * The type Yuv utils. + */ +public final class YUVUtils { + + private YUVUtils() { + } + + /** + * Encode i 420. 
+ * + * @param i420 the 420 + * @param argb the argb + * @param width the width + * @param height the height + */ public static void encodeI420(byte[] i420, int[] argb, int width, int height) { final int frameSize = width * height; int yIndex = 0; // Y start index int uIndex = frameSize; // U start index - int vIndex = frameSize * 5 / 4; // V start index: w*h*5/4 + int vIndex = frameSize * 5 / 4; // v start index: w*h*5/4 - int a, R, G, B, Y, U, V; + int r, g, b, y, u, v; int index = 0; for (int j = 0; j < height; j++) { for (int i = 0; i < width; i++) { - a = (argb[index] & 0xff000000) >> 24; // is not used obviously - R = (argb[index] & 0xff0000) >> 16; - G = (argb[index] & 0xff00) >> 8; - B = (argb[index] & 0xff) >> 0; + r = (argb[index] & 0xff0000) >> 16; + g = (argb[index] & 0xff00) >> 8; + b = (argb[index] & 0xff) >> 0; // well known RGB to YUV algorithm - Y = ((66 * R + 129 * G + 25 * B + 128) >> 8) + 16; - U = ((-38 * R - 74 * G + 112 * B + 128) >> 8) + 128; - V = ((112 * R - 94 * G - 18 * B + 128) >> 8) + 128; + y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16; + u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128; + v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128; // I420(YUV420p) -> YYYYYYYY UU VV - i420[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y)); + i420[yIndex++] = (byte) ((y < 0) ? 0 : ((y > 255) ? 255 : y)); if (j % 2 == 0 && i % 2 == 0) { - i420[uIndex++] = (byte) ((U < 0) ? 0 : ((U > 255) ? 255 : U)); - i420[vIndex++] = (byte) ((V < 0) ? 0 : ((V > 255) ? 255 : V)); + i420[uIndex++] = (byte) ((u < 0) ? 0 : ((u > 255) ? 255 : u)); + i420[vIndex++] = (byte) ((v < 0) ? 0 : ((v > 255) ? 255 : v)); } index++; } } } + /** + * Encode nv 21.
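The rename-only change to encodeI420/encodeNV21 keeps the fixed-point BT.601 RGB-to-YUV formula intact. It can be sanity-checked outside Android with the expected studio-swing anchor points (black maps to Y=16, white to Y=235, mid-grey is chroma-neutral at 128); the class name YuvMath below is hypothetical, not part of the patch:

```java
// Standalone check of the fixed-point RGB -> YUV conversion used by
// encodeI420/encodeNV21 above. Coefficients copied from the patch; illustrative only.
public class YuvMath {
    static int clamp(int value) {
        return value < 0 ? 0 : Math.min(value, 255);
    }

    static int rgbToY(int r, int g, int b) {
        return clamp(((66 * r + 129 * g + 25 * b + 128) >> 8) + 16);
    }

    static int rgbToU(int r, int g, int b) {
        return clamp(((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128);
    }

    static int rgbToV(int r, int g, int b) {
        return clamp(((112 * r - 94 * g - 18 * b + 128) >> 8) + 128);
    }

    public static void main(String[] args) {
        System.out.println(rgbToY(0, 0, 0));       // black  -> 16
        System.out.println(rgbToY(255, 255, 255)); // white  -> 235
        System.out.println(rgbToU(128, 128, 128)); // grey U -> 128
        System.out.println(rgbToV(128, 128, 128)); // grey V -> 128
    }
}
```

The `+ 128` term before each `>> 8` implements round-to-nearest in the fixed-point divide, which is why the clamp to [0, 255] in the patch rarely triggers in practice.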
+ * + * @param yuv420sp the yuv 420 sp + * @param argb the argb + * @param width the width + * @param height the height + */ public static void encodeNV21(byte[] yuv420sp, int[] argb, int width, int height) { final int frameSize = width * height; int yIndex = 0; int uvIndex = frameSize; - int a, R, G, B, Y, U, V; + int r, g, b, y, u, v; int index = 0; for (int j = 0; j < height; j++) { for (int i = 0; i < width; i++) { - a = (argb[index] & 0xff000000) >> 24; // a is not used obviously - R = (argb[index] & 0xff0000) >> 16; - G = (argb[index] & 0xff00) >> 8; - B = (argb[index] & 0xff) >> 0; + r = (argb[index] & 0xff0000) >> 16; + g = (argb[index] & 0xff00) >> 8; + b = (argb[index] & 0xff) >> 0; // well known RGB to YUV algorithm - Y = ((66 * R + 129 * G + 25 * B + 128) >> 8) + 16; - U = ((-38 * R - 74 * G + 112 * B + 128) >> 8) + 128; - V = ((112 * R - 94 * G - 18 * B + 128) >> 8) + 128; + y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16; + u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128; + v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128; // NV21 has a plane of Y and interleaved planes of VU each sampled by a factor of 2 // meaning for every 4 Y pixels there are 1 V and 1 U. Note the sampling is every other // pixel AND every other scanline. - yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y)); + yuv420sp[yIndex++] = (byte) ((y < 0) ? 0 : ((y > 255) ? 255 : y)); if (j % 2 == 0 && index % 2 == 0) { - yuv420sp[uvIndex++] = (byte) ((V < 0) ? 0 : ((V > 255) ? 255 : V)); - yuv420sp[uvIndex++] = (byte) ((U < 0) ? 0 : ((U > 255) ? 255 : U)); + yuv420sp[uvIndex++] = (byte) ((v < 0) ? 0 : ((v > 255) ? 255 : v)); + yuv420sp[uvIndex++] = (byte) ((u < 0) ? 0 : ((u > 255) ? 255 : u)); } index++; } } } + /** + * Swap yu 12 to yuv 420 sp. 
+ * + * @param yu12bytes the YU12 source bytes + * @param i420bytes the YUV420SP destination bytes + * @param width the width + * @param height the height + * @param yStride the y stride + * @param uStride the u stride + * @param vStride the v stride + */ public static void swapYU12toYUV420SP(byte[] yu12bytes, byte[] i420bytes, int width, int height, int yStride, int uStride, int vStride) { System.arraycopy(yu12bytes, 0, i420bytes, 0, yStride * height); int startPos = yStride * height; @@ -99,14 +131,29 @@ public static void swapYU12toYUV420SP(byte[] yu12bytes, byte[] i420bytes, int wi } } - public static Bitmap i420ToBitmap(int width, int height, int rotation, int bufferLength, byte[] buffer, int yStride, int uStride, int vStride) { - byte[] NV21 = new byte[bufferLength]; - swapYU12toYUV420SP(buffer, NV21, width, height, yStride, uStride, vStride); + /** + * Convert I420 data to a Bitmap. + * + * @param width the width + * @param height the height + * @param rotation the rotation + * @param bufferLength the buffer length + * @param buffer the buffer + * @param yStride the y stride + * @param uStride the u stride + * @param vStride the v stride + * @return the bitmap + */ + public static Bitmap i420ToBitmap(int width, int height, int rotation, + int bufferLength, byte[] buffer, + int yStride, int uStride, int vStride) { + byte[] nv21 = new byte[bufferLength]; + swapYU12toYUV420SP(buffer, nv21, width, height, yStride, uStride, vStride); ByteArrayOutputStream baos = new ByteArrayOutputStream(); int[] strides = {yStride, yStride}; - YuvImage image = new YuvImage(NV21, ImageFormat.NV21, width, height, strides); + YuvImage image = new YuvImage(nv21, ImageFormat.NV21, width, height, strides); image.compressToJpeg( new Rect(0, 0, image.getWidth(), image.getHeight()), @@ -118,13 +165,20 @@ public static Bitmap i420ToBitmap(int width, int height, int rotation, int buffe byte[] bytes = baos.toByteArray(); try { baos.close(); - } - catch (IOException e) { + } catch (IOException e) { e.printStackTrace(); }
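The encodeI420/encodeNV21 helpers in this patch share the same fixed-point, limited-range BT.601 RGB-to-YUV math. The per-pixel conversion can be sketched as follows; the class and method names here are illustrative, not part of the patched file:

```java
/** Sketch of the fixed-point BT.601 limited-range RGB -> YUV conversion used above. */
public class RgbToYuvSketch {
    /** Clamp a value into the valid byte range [0, 255]. */
    static int clamp(int v) {
        return v < 0 ? 0 : Math.min(v, 255);
    }

    /** Returns {Y, U, V} for one 0xAARRGGBB pixel; coefficients are pre-scaled by 256. */
    static int[] toYuv(int argb) {
        int r = (argb >> 16) & 0xff;
        int g = (argb >> 8) & 0xff;
        int b = argb & 0xff;
        int y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;   // luma, offset 16
        int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128; // blue-difference chroma
        int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;  // red-difference chroma
        return new int[] {clamp(y), clamp(u), clamp(v)};
    }
}
```

A quick sanity check on the coefficients: black maps to {16, 128, 128} and white to {235, 128, 128}, the limited-range BT.601 extremes.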
return BitmapFactory.decodeByteArray(bytes, 0, bytes.length); } + /** + * Blur bitmap. + * + * @param context the context + * @param image the image + * @param radius the radius + * @return the bitmap + */ public static Bitmap blur(Context context, Bitmap image, float radius) { RenderScript rs = RenderScript.create(context); Bitmap outputBitmap = Bitmap.createBitmap(image.getWidth(), image.getHeight(), Bitmap.Config.ARGB_8888); @@ -142,6 +196,14 @@ public static Bitmap blur(Context context, Bitmap image, float radius) { return outputBitmap; } + /** + * Convert a Bitmap to an I420 byte array. + * + * @param inputWidth the input width + * @param inputHeight the input height + * @param scaled the scaled bitmap + * @return the byte array + */ public static byte[] bitmapToI420(int inputWidth, int inputHeight, Bitmap scaled) { int[] argb = new int[inputWidth * inputHeight]; scaled.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight); @@ -151,6 +213,16 @@ public static byte[] bitmapToI420(int inputWidth, int inputHeight, Bitmap scaled return yuv; } + /** + * Copy three YUV planes into a wrapped I420 byte array.
+ * + * @param bufferY the buffer y + * @param bufferU the buffer u + * @param bufferV the buffer v + * @param width the width + * @param height the height + * @return the byte array + */ public static byte[] toWrappedI420(ByteBuffer bufferY, ByteBuffer bufferU, ByteBuffer bufferV, @@ -183,10 +255,16 @@ public static byte[] toWrappedI420(ByteBuffer bufferY, return out; } + /** * Convert I420 to NV21. + * + * @param data the data + * @param width the width + * @param height the height + * @return the byte array */ - public static byte[] I420ToNV21(byte[] data, int width, int height) { + public static byte[] i420ToNV21(byte[] data, int width, int height) { byte[] ret = new byte[data.length]; int total = width * height; @@ -202,7 +280,16 @@ public static byte[] I420ToNV21(byte[] data, int width, int height) { return ret; } - public static Bitmap NV21ToBitmap(Context context, byte[] nv21, int width, int height) { + /** + * Convert NV21 data to a Bitmap. + * + * @param context the context + * @param nv21 the NV21 data + * @param width the width + * @param height the height + * @return the bitmap + */ + public static Bitmap nv21ToBitmap(Context context, byte[] nv21, int width, int height) { RenderScript rs = RenderScript.create(context); ScriptIntrinsicYuvToRGB yuvToRgbIntrinsic = ScriptIntrinsicYuvToRGB.create(rs, U8_4(rs)); Builder yuvType = null; diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/ByteDanceBeautyAPI.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/ByteDanceBeautyAPI.kt index 5de913770..c14e73ae2 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/ByteDanceBeautyAPI.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/ByteDanceBeautyAPI.kt @@ -31,13 +31,40 @@ import io.agora.base.VideoFrame import io.agora.rtc2.Constants import io.agora.rtc2.RtcEngine +/** + * Version + */ const val VERSION = "1.0.3" +/** + * Capture mode + * + * @constructor Create empty Capture mode + */ enum
class CaptureMode{ + /** + * Agora + * + * @constructor Create empty Agora + */ Agora, // Use the Agora SDK's internal raw-data interface for processing + + /** + * Custom + * + * @constructor Create empty Custom + */ Custom // Custom mode: call onFrame yourself to pass raw video frames to the BeautyAPI for processing } +/** + * Event callback + * + * @property onBeautyStats + * @property onEffectInitialized + * @property onEffectDestroyed + * @constructor Create empty Event callback + */ data class EventCallback( /** * Stats callback, invoked once after each processed frame @@ -57,27 +84,83 @@ data class EventCallback( val onEffectDestroyed: (()->Unit)? = null ) +/** + * Beauty stats + * + * @property minCostMs + * @property maxCostMs + * @property averageCostMs + * @constructor Create empty Beauty stats + */ data class BeautyStats( val minCostMs:Long, // minimum cost within the stats interval val maxCostMs: Long, // maximum cost within the stats interval val averageCostMs: Long // average cost within the stats interval ) +/** + * Mirror mode + * + * @constructor Create empty Mirror mode + */ enum class MirrorMode { // Definition of a normal, unmirrored picture: with the front camera, the captured picture and the on-phone preview are left-right reversed; with the back camera they match + /** + * Mirror Local Remote + * + * @constructor Create empty Mirror Local Remote + */ MIRROR_LOCAL_REMOTE, // Mirror both local and remote (front-camera default); local and remote stickers both look normal + + /** + * Mirror Local Only + * + * @constructor Create empty Mirror Local Only + */ MIRROR_LOCAL_ONLY, // Mirror local only, not remote: remote stickers look normal, local stickers are mirrored. For call scenarios and e-commerce live streams (keeps signboard text behind the host readable on the remote side); since local and remote are reversed, sticker text on one side is inevitably reversed + + /** + * Mirror Remote Only + * + * @constructor Create empty Mirror Remote Only + */ MIRROR_REMOTE_ONLY, // Mirror remote only, not local: remote stickers look normal, local stickers are mirrored + + /** + * Mirror None + * + * @constructor Create empty Mirror None + */ MIRROR_NONE // Mirror neither local nor remote (back-camera default); local and remote stickers both look normal } +/** + * Camera config + * + * @property frontMirror + * @property backMirror + * @constructor Create empty Camera config + */ data class CameraConfig( val frontMirror: MirrorMode = MirrorMode.MIRROR_LOCAL_REMOTE, // Front-camera default mirroring: mirror both local and remote val backMirror: MirrorMode = MirrorMode.MIRROR_NONE
// 鍚庣疆榛樿闀滃儚锛氭湰鍦拌繙绔兘涓嶉暅鍍 ) +/** + * Config + * + * @property context + * @property rtcEngine + * @property renderManager + * @property eventCallback + * @property captureMode + * @property statsDuration + * @property statsEnable + * @property cameraConfig + * @constructor Create empty Config + */ data class Config( val context: Context, // Android Context涓婁笅鏂 val rtcEngine: RtcEngine, // 澹扮綉Rtc寮曟搸 @@ -89,24 +172,103 @@ data class Config( val cameraConfig: CameraConfig = CameraConfig() // 鎽勫儚澶撮暅鍍忛厤缃 ) +/** + * Error code + * + * @property value + * @constructor Create empty Error code + */ enum class ErrorCode(val value: Int) { + /** + * Error Ok + * + * @constructor Create empty Error Ok + */ ERROR_OK(0), // 涓鍒囨甯 + + /** + * Error Has Not Initialized + * + * @constructor Create empty Error Has Not Initialized + */ ERROR_HAS_NOT_INITIALIZED(101), // 娌℃湁璋冪敤Initialize鎴栬皟鐢ㄥけ璐ユ儏鍐典笅璋冪敤浜嗗叾浠朅PI + + /** + * Error Has Initialized + * + * @constructor Create empty Error Has Initialized + */ ERROR_HAS_INITIALIZED(102), // 宸茬粡Initialize鎴愬姛鍚庡啀娆¤皟鐢ㄦ姤閿 + + /** + * Error Has Released + * + * @constructor Create empty Error Has Released + */ ERROR_HAS_RELEASED(103), // 宸茬粡璋冪敤release閿姣佸悗杩樿皟鐢ㄥ叾浠朅PI + + /** + * Error Process Not Custom + * + * @constructor Create empty Error Process Not Custom + */ ERROR_PROCESS_NOT_CUSTOM(104), // 闈濩ustom澶勭悊妯″紡涓嬭皟鐢╫nFrame鎺ュ彛浠庡閮ㄤ紶鍏ヨ棰戝抚 + + /** + * Error Process Disable + * + * @constructor Create empty Error Process Disable + */ ERROR_PROCESS_DISABLE(105), // 褰撹皟鐢╡nable(false)绂佺敤缇庨鍚庤皟鐢╫nFrame鎺ュ彛杩斿洖 + + /** + * Error View Type Error + * + * @constructor Create empty Error View Type Error + */ ERROR_VIEW_TYPE_ERROR(106), // 褰撹皟鐢╯etupLocalVideo鏃秜iew绫诲瀷閿欒鏃惰繑鍥 + + /** + * Error Frame Skipped + * + * @constructor Create empty Error Frame Skipped + */ ERROR_FRAME_SKIPPED(107), // 褰撳鐞嗗抚蹇界暐鏃跺湪onFrame杩斿洖 } +/** + * Beauty preset + * + * @constructor Create empty Beauty preset + */ enum class BeautyPreset { + /** + * Custom + * + * @constructor Create 
empty Custom + */ CUSTOM, // Do not use the recommended beauty parameters + + /** + * Default + * + * @constructor Create empty Default + */ DEFAULT // The default preset } +/** + * Create a ByteDanceBeautyAPI instance + * + * @return + */ fun createByteDanceBeautyAPI(): ByteDanceBeautyAPI = ByteDanceBeautyAPIImpl() +/** + * ByteDance Beauty API + * + * @constructor Create empty ByteDance Beauty API + */ interface ByteDanceBeautyAPI { /** @@ -165,6 +327,11 @@ interface ByteDanceBeautyAPI { */ fun isFrontCamera(): Boolean + /** + * Get mirror applied + * + * @return + */ fun getMirrorApplied(): Boolean /** diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/ByteDanceBeautyAPIImpl.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/ByteDanceBeautyAPIImpl.kt index c72597056..fc055b6d9 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/ByteDanceBeautyAPIImpl.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/ByteDanceBeautyAPIImpl.kt @@ -47,6 +47,11 @@ import java.nio.ByteBuffer import java.util.concurrent.Callable import java.util.concurrent.Executors +/** + * ByteDance Beauty API implementation + * + * @constructor Create empty ByteDance Beauty API implementation + */ class ByteDanceBeautyAPIImpl : ByteDanceBeautyAPI, IVideoFrameObserver { private val TAG = "ByteDanceBeautyAPIImpl" private val reportId = "scenarioAPI" @@ -72,9 +77,41 @@ class ByteDanceBeautyAPIImpl : ByteDanceBeautyAPI, IVideoFrameObserver { private var localVideoRenderMode = Constants.RENDER_MODE_HIDDEN private enum class BeautyProcessType{ - UNKNOWN, TEXTURE_OES, TEXTURE_2D, I420 + /** + * Unknown + * + * @constructor Create empty Unknown + */ + UNKNOWN, + + /** + * Texture Oes + * + * @constructor Create empty Texture Oes + */ + TEXTURE_OES, + + /** + * Texture 2d + * + * @constructor Create empty Texture 2d + */ + TEXTURE_2D, + + /** + * I420 + * + * @constructor Create empty I420 + */ + I420 } + /** + * Initialize + * + * @param config + * @return + */
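Stepping back to the Java YUV helpers earlier in this patch: the renamed i420ToNV21 reorders planar I420 (all Y, then all U, then all V) into NV21 (all Y, then interleaved V/U pairs). A minimal sketch of that plane shuffle, with hypothetical class and method names (it mirrors the patched logic but is not the patched method itself):

```java
/** Illustrative I420 -> NV21 repacking: Y plane copied, then V/U interleaved. */
public class YuvRepackSketch {
    static byte[] i420ToNv21(byte[] i420, int width, int height) {
        int ySize = width * height;   // full-resolution luma plane
        int quarter = ySize / 4;      // each chroma plane is (w/2) * (h/2)
        byte[] nv21 = new byte[i420.length];
        System.arraycopy(i420, 0, nv21, 0, ySize);           // Y plane is unchanged
        for (int i = 0; i < quarter; i++) {
            nv21[ySize + 2 * i] = i420[ySize + quarter + i]; // V sample first (NV21 order)
            nv21[ySize + 2 * i + 1] = i420[ySize + i];       // then the U sample
        }
        return nv21;
    }
}
```

On a 2x2 frame the single U/V pair simply swaps into V-then-U order after the four Y bytes, which is the quickest way to verify the interleaving direction.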
override fun initialize(config: Config): Int { if (this.config != null) { LogUtils.e(TAG, "initialize >> The beauty api has been initialized!") @@ -94,6 +131,12 @@ class ByteDanceBeautyAPIImpl : ByteDanceBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_OK.value } + /** + * Enable + * + * @param enable + * @return + */ override fun enable(enable: Boolean): Int { LogUtils.i(TAG, "enable >> enable = $enable") if (config == null) { @@ -113,6 +156,13 @@ class ByteDanceBeautyAPIImpl : ByteDanceBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_OK.value } + /** + * Setup local video + * + * @param view + * @param renderMode + * @return + */ override fun setupLocalVideo(view: View, renderMode: Int): Int { val rtcEngine = config?.rtcEngine if(rtcEngine == null){ @@ -130,6 +180,12 @@ class ByteDanceBeautyAPIImpl : ByteDanceBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_VIEW_TYPE_ERROR.value } + /** + * On frame + * + * @param videoFrame + * @return + */ override fun onFrame(videoFrame: VideoFrame): Int { val conf = config if (conf == null) { @@ -154,6 +210,15 @@ class ByteDanceBeautyAPIImpl : ByteDanceBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_FRAME_SKIPPED.value } + /** + * Set beauty preset + * + * @param preset + * @param beautyNodePath + * @param beauty4ItemNodePath + * @param reSharpNodePath + * @return + */ override fun setBeautyPreset( preset: BeautyPreset, beautyNodePath: String, @@ -236,12 +301,24 @@ class ByteDanceBeautyAPIImpl : ByteDanceBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_OK.value } + /** + * Set parameters + * + * @param key + * @param value + */ override fun setParameters(key: String, value: String) { when (key) { "beauty_mode" -> beautyMode = value.toInt() } } + /** + * Update camera config + * + * @param config + * @return + */ override fun updateCameraConfig(config: CameraConfig): Int { LogUtils.i(TAG, "updateCameraConfig >> oldCameraConfig=$cameraConfig, newCameraConfig=$config") cameraConfig = 
CameraConfig(config.frontMirror, config.backMirror) @@ -250,8 +327,17 @@ class ByteDanceBeautyAPIImpl : ByteDanceBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_OK.value } + /** + * Is front camera + * + */ override fun isFrontCamera() = isFrontCamera + /** + * Release + * + * @return + */ override fun release(): Int { val conf = config if(conf == null){ @@ -533,7 +619,7 @@ class ByteDanceBeautyAPIImpl : ByteDanceBeautyAPI, IVideoFrameObserver { }) } - private fun getNV21Buffer(videoFrame: VideoFrame, rotate: Boolean = false): ByteArray? { + private fun getNV21Buffer(videoFrame: VideoFrame): ByteArray? { val buffer = videoFrame.buffer val i420Buffer = buffer as? I420Buffer ?: buffer.toI420() val width = i420Buffer.width @@ -562,29 +648,75 @@ class ByteDanceBeautyAPIImpl : ByteDanceBeautyAPI, IVideoFrameObserver { // IVideoFrameObserver implements + /** + * On capture video frame + * + * @param sourceType + * @param videoFrame + * @return + */ override fun onCaptureVideoFrame(sourceType: Int, videoFrame: VideoFrame?): Boolean { videoFrame ?: return false return processBeauty(videoFrame) } + /** + * On pre encode video frame + * + * @param sourceType + * @param videoFrame + */ override fun onPreEncodeVideoFrame(sourceType: Int, videoFrame: VideoFrame?) = false + /** + * On media player video frame + * + * @param videoFrame + * @param mediaPlayerId + */ override fun onMediaPlayerVideoFrame(videoFrame: VideoFrame?, mediaPlayerId: Int) = false + /** + * On render video frame + * + * @param channelId + * @param uid + * @param videoFrame + */ override fun onRenderVideoFrame( channelId: String?, uid: Int, videoFrame: VideoFrame? 
) = false + /** + * Get video frame process mode + * + */ override fun getVideoFrameProcessMode() = IVideoFrameObserver.PROCESS_MODE_READ_WRITE + /** + * Get video format preference + * + */ override fun getVideoFormatPreference() = IVideoFrameObserver.VIDEO_PIXEL_DEFAULT + /** + * Get rotation applied + * + */ override fun getRotationApplied() = false + /** + * Get mirror applied + * + */ override fun getMirrorApplied() = captureMirror && !enable + /** + * Get observed frame position + * + */ override fun getObservedFramePosition() = IVideoFrameObserver.POSITION_POST_CAPTURER } \ No newline at end of file diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/AgoraImageHelper.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/AgoraImageHelper.kt index 37e068c90..609139091 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/AgoraImageHelper.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/AgoraImageHelper.kt @@ -30,10 +30,25 @@ import io.agora.base.internal.video.GlRectDrawer import io.agora.base.internal.video.GlTextureFrameBuffer import io.agora.base.internal.video.RendererCommon.GlDrawer +/** + * Agora image helper + * + * @constructor Create empty Agora image helper + */ class AgoraImageHelper { private var glFrameBuffer: GlTextureFrameBuffer? = null private var drawer : GlDrawer? 
= null + /** + * Transform texture + * + * @param texId + * @param texType + * @param width + * @param height + * @param transform + * @return + */ fun transformTexture( texId: Int, texType: VideoFrame.TextureBuffer.Type, @@ -66,6 +81,10 @@ class AgoraImageHelper { return frameBuffer.textureId } + /** + * Release + * + */ fun release() { glFrameBuffer?.release() glFrameBuffer = null diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/GLTestUtils.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/GLTestUtils.java index 3d17f9f74..70cb19208 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/GLTestUtils.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/GLTestUtils.java @@ -37,9 +37,23 @@ import java.nio.ByteBuffer; import java.nio.IntBuffer; -public class GLTestUtils { +/** + * The type Gl test utils. + */ +public final class GLTestUtils { private static final String TAG = "GLUtils"; + private GLTestUtils() { + } + + /** + * Gets texture 2 d image. + * + * @param textureID the texture id + * @param width the width + * @param height the height + * @return the texture 2 d image + */ public static Bitmap getTexture2DImage(int textureID, int width, int height) { try { int[] oldFboId = new int[1]; @@ -81,6 +95,14 @@ public static Bitmap getTexture2DImage(int textureID, int width, int height) { return null; } + /** + * Gets texture oes image. + * + * @param textureID the texture id + * @param width the width + * @param height the height + * @return the texture oes image + */ public static Bitmap getTextureOESImage(int textureID, int width, int height) { try { int[] oldFboId = new int[1]; @@ -122,6 +144,14 @@ public static Bitmap getTextureOESImage(int textureID, int width, int height) { return null; } + /** + * Nv 21 to bitmap bitmap. 
+ * + * @param nv21 the nv 21 + * @param width the width + * @param height the height + * @return the bitmap + */ public static Bitmap nv21ToBitmap(byte[] nv21, int width, int height) { Bitmap bitmap = null; try { @@ -136,7 +166,7 @@ public static Bitmap nv21ToBitmap(byte[] nv21, int width, int height) { return bitmap; } - private static Bitmap readBitmap(int width, int height){ + private static Bitmap readBitmap(int width, int height) { ByteBuffer rgbaBuf = ByteBuffer.allocateDirect(width * height * 4); rgbaBuf.position(0); GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, rgbaBuf); diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/ImageUtil.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/ImageUtil.java index 8d6b7ebd8..c324956da 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/ImageUtil.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/ImageUtil.java @@ -30,6 +30,7 @@ import android.graphics.Point; import android.opengl.GLES20; import android.opengl.Matrix; +import android.util.Log; import android.widget.ImageView; import com.bytedance.labcv.effectsdk.BytedEffectConstants; @@ -46,73 +47,69 @@ public class ImageUtil { private static final String TAG = "ImageUtil"; + /** + * The M frame buffers. + */ protected int[] mFrameBuffers; + /** + * The M frame buffer textures. + */ protected int[] mFrameBufferTextures; - protected int FRAME_BUFFER_NUM = 1; + /** + * The Frame buffer num. + */ + protected int frameBufferNum = 1; + /** + * The M frame buffer shape. + */ protected Point mFrameBufferShape; private ProgramManager mProgramManager; - - - /** {zh} - * 榛樿鏋勯犲嚱鏁 - */ - /** {en} + /** + * {en} * Default constructor + *

    + */ - public ImageUtil() { } - /** {zh} + /** + * Prepare the frame buffer texture object * * @param width texture width * @param height texture height - * @return 绾圭悊ID - */ - /** {en} - * Prepare frame buffer texture object - * - * @param width texture width - * @param height texture height - * @return texture ID + * @return the texture ID */ - public int prepareTexture(int width, int height) { initFrameBufferIfNeed(width, height); return mFrameBufferTextures[0]; } - /** {zh} + /** + * Default off-screen rendering bound texture - * @return 绾圭悊id - */ - /** {en} - * Default off-screen rendering bound texture - * @return texture id + * + * @return the output texture id

    {en} Default off-screen rendering bound texture */ - public int getOutputTexture() { - if (mFrameBufferTextures == null) return GlUtil.NO_TEXTURE; + if (mFrameBufferTextures == null) { + return GlUtil.NO_TEXTURE; + } return mFrameBufferTextures[0]; } - /** {zh} + /** + * {zh} * 鍒濆鍖栧抚缂撳啿鍖 * * @param width 缂撳啿鐨勭汗鐞嗗搴 * @param height 缂撳啿鐨勭汗鐞嗛珮搴 */ - /** {en} - * Initialize frame buffer - * - * @param width buffered texture width - * @param height buffered texture height - */ - private void initFrameBufferIfNeed(int width, int height) { boolean need = false; if (null == mFrameBufferShape || mFrameBufferShape.x != width || mFrameBufferShape.y != height) { @@ -123,11 +120,11 @@ private void initFrameBufferIfNeed(int width, int height) { } if (need) { destroyFrameBuffers(); - mFrameBuffers = new int[FRAME_BUFFER_NUM]; - mFrameBufferTextures = new int[FRAME_BUFFER_NUM]; - GLES20.glGenFramebuffers(FRAME_BUFFER_NUM, mFrameBuffers, 0); - GLES20.glGenTextures(FRAME_BUFFER_NUM, mFrameBufferTextures, 0); - for (int i = 0; i < FRAME_BUFFER_NUM; i++) { + mFrameBuffers = new int[frameBufferNum]; + mFrameBufferTextures = new int[frameBufferNum]; + GLES20.glGenFramebuffers(frameBufferNum, mFrameBuffers, 0); + GLES20.glGenTextures(frameBufferNum, mFrameBufferTextures, 0); + for (int i = 0; i < frameBufferNum; i++) { bindFrameBuffer(mFrameBufferTextures[i], mFrameBuffers[i], width, height); } mFrameBufferShape = new Point(width, height); @@ -135,35 +132,21 @@ private void initFrameBufferIfNeed(int width, int height) { } - /** {zh} + /** + * {zh} * 閿姣佸抚缂撳啿鍖哄璞 */ - /** {en} - * Destroy frame buffer objects - */ - private void destroyFrameBuffers() { if (mFrameBufferTextures != null) { - GLES20.glDeleteTextures(FRAME_BUFFER_NUM, mFrameBufferTextures, 0); + GLES20.glDeleteTextures(frameBufferNum, mFrameBufferTextures, 0); mFrameBufferTextures = null; } if (mFrameBuffers != null) { - GLES20.glDeleteFramebuffers(FRAME_BUFFER_NUM, mFrameBuffers, 0); + 
GLES20.glDeleteFramebuffers(frameBufferNum, mFrameBuffers, 0); mFrameBuffers = null; } } - /** {zh} - * 绾圭悊鍙傛暟璁剧疆+buffer缁戝畾 - * set texture params - * and bind buffer - */ - /** {en} - * Texture parameter setting + buffer binding - * set texture params - * and binding buffer - */ - private void bindFrameBuffer(int textureId, int frameBuffer, int width, int height) { GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId); GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0, @@ -186,14 +169,15 @@ private void bindFrameBuffer(int textureId, int frameBuffer, int width, int heig } - - /** {zh} + /** + * {zh} * 閲婃斁璧勬簮锛屽寘鎷抚缂撳啿鍖哄強Program瀵硅薄 - */ - /** {en} + *

    + * Free resources, including frame buffers and Program objects + */ - public void release() { destroyFrameBuffers(); if (null != mProgramManager) { @@ -205,23 +189,18 @@ public void release() { } } - /** {zh} + /** + * Read the buffer of the rendering result * * @param imageWidth image width * @param imageHeight image height - * @return 娓叉煋缁撴灉鐨勫儚绱燘uffer 鏍煎紡RGBA + * @return pixel buffer of the rendering result, in RGBA format

    {en} Read the buffer */ - /** {en} - * Read the buffer - * - * @param imageWidth image width - * @param imageHeight image height - * @return pixel Buffer format of the rendered result RGBA - */ - public ByteBuffer captureRenderResult(int imageWidth, int imageHeight) { - if (mFrameBufferTextures == null) return null; + if (mFrameBufferTextures == null) { + return null; + } int textureId = mFrameBufferTextures[0]; if (null == mFrameBufferTextures || textureId == GlUtil.NO_TEXTURE) { return null; @@ -258,21 +237,15 @@ public ByteBuffer captureRenderResult(int imageWidth, int imageHeight) { return mCaptureBuffer; } - /** {zh} + /** + * {zh} * 璇诲彇娓叉煋缁撴灉鐨刡uffer * + * @param textureId the texture id * @param imageWidth 鍥惧儚瀹藉害 * @param imageHeight 鍥惧儚楂樺害 - * @return 娓叉煋缁撴灉鐨勫儚绱燘uffer 鏍煎紡RGBA - */ - /** {en} - * Read the buffer - * - * @param imageWidth image width - * @param imageHeight image height - * @return pixel Buffer format of the rendered result RGBA + * @return 娓叉煋缁撴灉鐨勫儚绱燘uffer 鏍煎紡RGBA

    {en} Read the buffer */ - public ByteBuffer captureRenderResult(int textureId, int imageWidth, int imageHeight) { if (textureId == GlUtil.NO_TEXTURE) { return null; @@ -309,25 +282,16 @@ public ByteBuffer captureRenderResult(int textureId, int imageWidth, int imageHe return mCaptureBuffer; } - /** {zh} + /** + * {zh} * 绾圭悊鎷疯礉 * - * @param srcTexture - * @param dstTexture - * @param width - * @param height - * @return + * @param srcTexture the src texture + * @param dstTexture the dst texture + * @param width the width + * @param height the height + * @return boolean

    {en} Texture copy */ - public boolean copyTexture(int srcTexture, int dstTexture, int width, int height) { if (srcTexture == GlUtil.NO_TEXTURE || dstTexture == GlUtil.NO_TEXTURE) { return false; @@ -351,99 +315,99 @@ public boolean copyTexture(int srcTexture, int dstTexture, int width, int height int error = GLES20.glGetError(); if (error != GLES20.GL_NO_ERROR) { String msg = "copyTexture glError 0x" + Integer.toHexString(error); + Log.e(TAG, msg); return false; } return true; - - } - /** {zh} + /** + * Texture-to-texture transform + * + * @param inputTexture input texture * @param inputTextureFormat input texture format, 2D/OES * @param outputTextureFormat output texture format, 2D/OES * @param width input texture width * @param height input texture height * @param transition texture transform mode - * @return 杈撳嚭绾圭悊 - * @brief 绾圭悊杞汗鐞 + * @return the output texture */ - /** {en} - * @param inputTextureFormat input texture format, 2D/OES - * @param outputTextureFormat output texture format, 2D/OES - * @param width input texture width - * @param height input texture height - * @param transition texture transformation mode - * @return output texture - * @brief texture to texture - */ - - public int transferTextureToTexture(int inputTexture, BytedEffectConstants.TextureFormat inputTextureFormat, - BytedEffectConstants.TextureFormat outputTextureFormat, - int width, int height, Transition transition) { - if (outputTextureFormat != BytedEffectConstants.TextureFormat.Texure2D){ - LogUtils.e(TAG, "the inputTexture is not supported,please use Texure2D as output texture format"); - return GlUtil.NO_TEXTURE; - } + public int transferTextureToTexture(int inputTexture, BytedEffectConstants.TextureFormat inputTextureFormat, + BytedEffectConstants.TextureFormat outputTextureFormat, + int width, int height, Transition transition) { + if (outputTextureFormat != BytedEffectConstants.TextureFormat.Texure2D) { + LogUtils.e(TAG, "the inputTexture is
not supported,please use Texure2D as output texture format"); + return GlUtil.NO_TEXTURE; + } if (null == mProgramManager) { mProgramManager = new ProgramManager(); } - boolean targetRoated = (transition.getAngle()%180 ==90); - return mProgramManager.getProgram(inputTextureFormat).drawFrameOffScreen(inputTexture, targetRoated?height:width, targetRoated?width:height, transition.getMatrix()); + boolean targetRoated = transition.getAngle() % 180 == 90; + return mProgramManager.getProgram(inputTextureFormat).drawFrameOffScreen(inputTexture, targetRoated ? height : width, targetRoated ? width : height, transition.getMatrix()); } private ProgramTextureYUV mYUVProgram; + + /** + * Transfer yuv to texture int. + * + * @param yBuffer the y buffer + * @param vuBuffer the vu buffer + * @param width the width + * @param height the height + * @param transition the transition + * @return the int + */ public int transferYUVToTexture(ByteBuffer yBuffer, ByteBuffer vuBuffer, int width, int height, Transition transition) { if (mYUVProgram == null) { mYUVProgram = new ProgramTextureYUV(); } int yTexture = GlUtil.createImageTexture(yBuffer, width, height, GLES20.GL_ALPHA); - int vuTexture = GlUtil.createImageTexture(vuBuffer, width/2, height/2, GLES20.GL_LUMINANCE_ALPHA); + int vuTexture = GlUtil.createImageTexture(vuBuffer, width / 2, height / 2, GLES20.GL_LUMINANCE_ALPHA); int rgbaTexture = mYUVProgram.drawFrameOffScreen(yTexture, vuTexture, width, height, transition.getMatrix()); GlUtil.deleteTextureId(new int[]{yTexture, vuTexture}); return rgbaTexture; } - /** {zh} - * @param texture 绾圭悊 + /** + * {zh} + * + * @param texture 绾圭悊 * @param inputTextureFormat 绾圭悊鏍煎紡锛2D/OES - * @param outputFormat 杈撳嚭 buffer 鏍煎紡 - * @param width 瀹 - * @param height 楂 + * @param outputFormat 杈撳嚭 buffer 鏍煎紡 + * @param width 瀹 + * @param height 楂 + * @param ratio the ratio * @return 杈撳嚭 buffer - * @brief 绾圭悊杞 buffer */ - /** {en} - * @param inputTextureFormat texture format, 2D/OES - * @param 
outputFormat output buffer format - * @param width width - * @param height height - * @return output buffer - * @brief texture turn buffer - */ - public ByteBuffer transferTextureToBuffer(int texture, BytedEffectConstants.TextureFormat inputTextureFormat, - BytedEffectConstants.PixlFormat outputFormat, int width, int height, float ratio){ - if (outputFormat != BytedEffectConstants.PixlFormat.RGBA8888){ + BytedEffectConstants.PixlFormat outputFormat, int width, int height, float ratio) { + if (outputFormat != BytedEffectConstants.PixlFormat.RGBA8888) { LogUtils.e(TAG, "the outputFormat is not supported,please use RGBA8888 as output texture format"); - return null; + return null; } if (null == mProgramManager) { mProgramManager = new ProgramManager(); } - return mProgramManager.getProgram(inputTextureFormat).readBuffer(texture, (int) (width*ratio), (int)(height*ratio)); - - - + return mProgramManager.getProgram(inputTextureFormat).readBuffer(texture, (int) (width * ratio), (int) (height * ratio)); } + /** + * Transfer texture to bitmap bitmap. 
+ * + * @param texture the texture + * @param inputTextureFormat the input texture format + * @param width the width + * @param height the height + * @return the bitmap + */ public Bitmap transferTextureToBitmap(int texture, BytedEffectConstants.TextureFormat inputTextureFormat, int width, int height) { ByteBuffer buffer = transferTextureToBuffer(texture, inputTextureFormat, BytedEffectConstants.PixlFormat.RGBA8888, @@ -454,38 +418,30 @@ public Bitmap transferTextureToBitmap(int texture, BytedEffectConstants.TextureF return transferBufferToBitmap(buffer, BytedEffectConstants.PixlFormat.RGBA8888, width, height); } - /** {zh} + /** + * {zh} + * * @param buffer 杈撳叆 buffer * @param inputFormat buffer 鏍煎紡 * @param outputFormat 杈撳嚭绾圭悊鏍煎紡 * @param width 瀹 * @param height 楂 - * @return 杈撳嚭绾圭悊 - * @brief buffer 杞汗鐞 - */ - /** {en} - * @param inputFormat buffer format - * @param outputFormat output texture format - * @param width width - * @param height height - * @return output texture - * @brief buffer turn texture + * @return 杈撳嚭绾圭悊 int */ - public int transferBufferToTexture(ByteBuffer buffer, BytedEffectConstants.PixlFormat inputFormat, - BytedEffectConstants.TextureFormat outputFormat, int width, int height){ + BytedEffectConstants.TextureFormat outputFormat, int width, int height) { - if (inputFormat != BytedEffectConstants.PixlFormat.RGBA8888){ + if (inputFormat != BytedEffectConstants.PixlFormat.RGBA8888) { LogUtils.e(TAG, "inputFormat support RGBA8888 only"); return GlUtil.NO_TEXTURE; } - if (outputFormat != BytedEffectConstants.TextureFormat.Texure2D){ + if (outputFormat != BytedEffectConstants.TextureFormat.Texure2D) { LogUtils.e(TAG, "outputFormat support Texure2D only"); return GlUtil.NO_TEXTURE; } - return create2DTexture(buffer, width,height, GL_RGBA); + return create2DTexture(buffer, width, height, GL_RGBA); } @@ -521,48 +477,33 @@ private int create2DTexture(ByteBuffer data, int width, int height, int format) return textureHandle; } - /** {zh} + /** + * 
Buffer to buffer. + * * @param buffer input buffer * @param inputFormat input buffer format * @param outputFormat output buffer format * @param width width * @param height height * @return output buffer - * @brief buffer to buffer - */ - /** {en} - * @param inputFormat input buffer format - * @param outputFormat output buffer format - * @param width width - * @param height height - * @return output buffer - * @brief buffer to buffer */ - public ByteBuffer transferBufferToBuffer(ByteBuffer buffer, BytedEffectConstants.PixlFormat inputFormat, - BytedEffectConstants.PixlFormat outputFormat, int width, int height){ + BytedEffectConstants.PixlFormat outputFormat, int width, int height) { return null; } - /** {zh} + /** + * Buffer to bitmap. + * * @param buffer input buffer * @param format input buffer format * @param width width * @param height height * @return output bitmap - * @brief buffer to bitmap - */ - /** {en} - * @param format input buffer format - * @param width width - * @param height height - * @return output bitmap - * @brief buffer turn bitmap */ - public Bitmap transferBufferToBitmap(ByteBuffer buffer, BytedEffectConstants.PixlFormat format, - int width, int height){ + int width, int height) { Bitmap mCameraBitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888); buffer.position(0); @@ -572,24 +513,17 @@ public Bitmap transferBufferToBitmap(ByteBuffer buffer, BytedEffectConstants.Pix } - /** {zh} + /** + * * Render texture on screen - * @param textureId texture ID + * + * @param textureId texture ID * @param srcTetxureFormat texture format - * @param surfaceWidth viewport width - * @param surfaceHeight viewport height - * @param mMVPMatrix rotation matrix - */ - /** {en} - * Render texture on screen - * @param textureId texture ID - * @param srcTetxureFormat texture format - * @param surfaceWidth viewport width - * @param surfaceHeight viewport height - * @param mMVPMatrix rotation matrix + * @param surfaceWidth viewport width + * @param surfaceHeight viewport height + * @param mMVPMatrix rotation matrix

*/ - - public void drawFrameOnScreen(int textureId,BytedEffectConstants.TextureFormat srcTetxureFormat,int surfaceWidth, int surfaceHeight, float[]mMVPMatrix) { + public void drawFrameOnScreen(int textureId, BytedEffectConstants.TextureFormat srcTetxureFormat, int surfaceWidth, int surfaceHeight, float[] mMVPMatrix) { if (null == mProgramManager) { mProgramManager = new ProgramManager(); } @@ -599,22 +533,26 @@ public void drawFrameOnScreen(int textureId,BytedEffectConstants.TextureFormat s } - /** {zh} - * @brief Transform mode class - */ - /** {en} - * @brief Transform mode class + /** + * The type Transition. */ - public static class Transition { private float[] mMVPMatrix = new float[16]; private int mAngle = 0; + /** + * Instantiates a new Transition. + */ public Transition() { Matrix.setIdentityM(mMVPMatrix, 0); } + /** + * Instantiates a new Transition. + * + * @param transformMatrixArray the transform matrix array + */ public Transition(float[] transformMatrixArray) { for (int i = 0; i < transformMatrixArray.length; i++) { mMVPMatrix[i] = transformMatrixArray[i]; @@ -622,31 +560,34 @@ public Transition(float[] transformMatrixArray) { } - /** {zh} - * @brief mirror + /** + * Mirror image. + * + * @param x the x + * @param y the y + * @return the transition */ - /** {en} - * @brief Mirror image - */ - public Transition flip(boolean x, boolean y) { GlUtil.flip(mMVPMatrix, x, y); return this; } + /** + * Gets angle. + * + * @return the angle + */ public int getAngle() { - return mAngle%360; + return mAngle % 360; } - /** {zh} + /** + * Rotate. + * * @param angle rotation angle; only 0/90/180/270 supported - * @brief rotate + * @return the transition */ - /** {en} - * @brief rotation - */ - public Transition rotate(float angle) { mAngle += angle; GlUtil.rotate(mMVPMatrix, angle); @@ -654,34 +595,44 @@ public Transition rotate(float angle) { } - public Transition scale(float sx,float sy) { - GlUtil.scale(mMVPMatrix, sx , sy); + /** + * Scale transition.
+ * + * @param sx the sx + * @param sy the sy + * @return the transition + */ + public Transition scale(float sx, float sy) { + GlUtil.scale(mMVPMatrix, sx, sy); return this; } - public Transition crop(ImageView.ScaleType scaleType, int rotation, int textureWidth, int textureHeight, int surfaceWidth, int surfaceHeight){ - if (rotation % 180 == 90){ - GlUtil.getShowMatrix(mMVPMatrix,scaleType, textureHeight, textureWidth, surfaceWidth, surfaceHeight); - }else { - GlUtil.getShowMatrix(mMVPMatrix,scaleType, textureWidth, textureHeight, surfaceWidth, surfaceHeight); + /** + * Crop transition. + * + * @param scaleType the scale type + * @param rotation the rotation + * @param textureWidth the texture width + * @param textureHeight the texture height + * @param surfaceWidth the surface width + * @param surfaceHeight the surface height + * @return the transition + */ + public Transition crop(ImageView.ScaleType scaleType, int rotation, int textureWidth, int textureHeight, int surfaceWidth, int surfaceHeight) { + if (rotation % 180 == 90) { + GlUtil.getShowMatrix(mMVPMatrix, scaleType, textureHeight, textureWidth, surfaceWidth, surfaceHeight); + } else { + GlUtil.getShowMatrix(mMVPMatrix, scaleType, textureWidth, textureHeight, surfaceWidth, surfaceHeight); } return this; } - /** {zh} + /** + * Reverses this transition: transform operations are order-dependent, and this method + * inverts a sequence of them, e.g. mirror-then-rotate becomes rotate-then-mirror. + * * @return the reversed transition - * @brief generate a new reversed transition - * @details transform operations are order-dependent; this method reverses a series of operations, - * e.g. mirror first then rotate becomes rotate first then mirror - */ - /** {en} - * @return Reverse transition - * @brief Reverse generation of new transition - * @details transformation operations can be divided into sequence. This method can reverse a series of operations, - * such as mirroring first and then rotating, and the reverse order is rotating first and then mirroring */ - public Transition reverse() { float[] invertedMatrix = new float[16]; @@ -693,13 +644,23 @@ public Transition reverse() { } - public float[] getMatrix(){ + /** + * Gets the matrix.
+ * + * @return the float [ ] + */ + public float[] getMatrix() { return mMVPMatrix; } - public String toString(){ - StringBuilder sb =new StringBuilder(); - for (float value: mMVPMatrix){ + /** + * To String. + * + * @return string + */ + public String toString() { + StringBuilder sb = new StringBuilder(); + for (float value : mMVPMatrix) { sb.append(value).append(" "); } return sb.toString(); diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/LogUtils.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/LogUtils.kt index e44fbb732..2ee86d9a5 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/LogUtils.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/LogUtils.kt @@ -32,6 +32,11 @@ import java.util.Date import java.util.Locale import java.util.concurrent.Executors +/** + * Log utils + * + * @constructor Create empty Log utils + */ object LogUtils { private const val beautyType = "ByteDance" private val timeFormat = SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS", Locale.ROOT) @@ -39,6 +44,11 @@ object LogUtils { private val workerThread = Executors.newSingleThreadExecutor() private var logOutputStream: FileOutputStream? 
= null + /** + * Set log file path + * + * @param path + */ @JvmStatic fun setLogFilePath(path: String){ if(path.isEmpty()){ @@ -58,6 +68,13 @@ object LogUtils { } + /** + * I + * + * @param tag + * @param content + * @param args + */ @JvmStatic fun i(tag: String, content: String, vararg args: Any) { val consoleMessage = "[BeautyAPI][$beautyType] : ${String.format(content, args)}" @@ -66,6 +83,13 @@ object LogUtils { saveToFile(fileMessage) } + /** + * D + * + * @param tag + * @param content + * @param args + */ @JvmStatic fun d(tag: String, content: String, vararg args: Any) { val consoleMessage = "[BeautyAPI][$beautyType] : ${String.format(content, args)}" @@ -74,6 +98,13 @@ object LogUtils { saveToFile(fileMessage) } + /** + * W + * + * @param tag + * @param content + * @param args + */ @JvmStatic fun w(tag: String, content: String, vararg args: Any){ val consoleMessage = "[BeautyAPI][$beautyType] : ${String.format(content, args)}" @@ -82,6 +113,13 @@ object LogUtils { saveToFile(fileMessage) } + /** + * E + * + * @param tag + * @param content + * @param args + */ @JvmStatic fun e(tag: String, content: String, vararg args: Any){ val consoleMessage = "[BeautyAPI][$beautyType] : ${String.format(content, args)}" diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/StatsHelper.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/StatsHelper.kt index 2f2abbe98..b4399ed7e 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/StatsHelper.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/StatsHelper.kt @@ -30,6 +30,13 @@ import io.agora.beautyapi.bytedance.BeautyStats import kotlin.math.max import kotlin.math.min +/** + * Stats helper + * + * @property statsDuration + * @property onStatsChanged + * @constructor Create empty Stats helper + */ class StatsHelper( private val statsDuration: Long, private val onStatsChanged: (BeautyStats) -> Unit @@ -41,6 
+48,11 @@ class StatsHelper( private var mCostMax = 0L private var mCostMin = Long.MAX_VALUE + /** + * Once + * + * @param cost + */ fun once(cost: Long) { val curr = System.currentTimeMillis() if (mStartTime == 0L) { @@ -68,6 +80,10 @@ class StatsHelper( mCostMin = min(mCostMin, cost) } + /** + * Reset + * + */ fun reset() { mMainHandler.removeCallbacksAndMessages(null) mStartTime = 0 diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/Drawable2d.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/Drawable2d.java index 0e5e13c74..e807db10a 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/Drawable2d.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/Drawable2d.java @@ -32,24 +32,33 @@ */ public class Drawable2d { private static final int SIZEOF_FLOAT = 4; + /** + * The constant COORDS_PER_VERTEX. + */ public static final int COORDS_PER_VERTEX = 2; + /** + * The constant TEXTURE_COORD_STRIDE. + */ public static final int TEXTURE_COORD_STRIDE = COORDS_PER_VERTEX * SIZEOF_FLOAT; + /** + * The constant VERTEXTURE_STRIDE. + */ public static final int VERTEXTURE_STRIDE = COORDS_PER_VERTEX * SIZEOF_FLOAT; /** * Simple equilateral triangle (1.0 per side). Centered on (0,0). 
*/ - private static final float TRIANGLE_COORDS[] = { - 0.0f, 0.577350269f, // 0 top - -0.5f, -0.288675135f, // 1 bottom left - 0.5f, -0.288675135f // 2 bottom right + private static final float[] TRIANGLE_COORDS = { + 0.0f, 0.577350269f, // 0 top + -0.5f, -0.288675135f, // 1 bottom left + 0.5f, -0.288675135f // 2 bottom right }; - private static final float TRIANGLE_TEX_COORDS[] = { - 0.5f, 0.0f, // 0 top center - 0.0f, 1.0f, // 1 bottom left - 1.0f, 1.0f, // 2 bottom right + private static final float[] TRIANGLE_TEX_COORDS = { + 0.5f, 0.0f, // 0 top center + 0.0f, 1.0f, // 1 bottom left + 1.0f, 1.0f, // 2 bottom right }; private static final FloatBuffer TRIANGLE_BUF = GlUtil.createFloatBuffer(TRIANGLE_COORDS); @@ -62,28 +71,25 @@ public class Drawable2d { *

    * Triangles are 0-1-2 and 2-1-3 (counter-clockwise winding). */ - private static final float RECTANGLE_COORDS[] = { - -0.5f, -0.5f, // 0 bottom left - 0.5f, -0.5f, // 1 bottom right - -0.5f, 0.5f, // 2 top left - 0.5f, 0.5f, // 3 top right + private static final float[] RECTANGLE_COORDS = { + -0.5f, -0.5f, // 0 bottom left + 0.5f, -0.5f, // 1 bottom right + -0.5f, 0.5f, // 2 top left + 0.5f, 0.5f, // 3 top right }; - /** {zh} - * FrameBuffer 涓庡睆骞曠殑鍧愭爣绯绘槸鍨傜洿闀滃儚鐨勶紝鎵浠ュ湪灏嗙汗鐞嗙粯鍒跺埌涓涓 FrameBuffer 鎴栧睆骞曚笂 - * 鐨勬椂鍊欙紝浠栦滑鐢ㄧ殑绾圭悊椤剁偣鍧愭爣鏄笉鍚岀殑锛岄渶瑕佹敞鎰忋 - */ - /** {en} + /** * The coordinate system of the FrameBuffer and the screen is mirrored vertically, so when drawing the texture to a FrameBuffer or screen * , the vertex coordinates of the texture they use are different, which needs attention. + *

*/ - - private static final float RECTANGLE_TEX_COORDS[] = { + private static final float[] RECTANGLE_TEX_COORDS = { 0.0f, 1.0f, // 0 bottom left 1.0f, 1.0f, // 1 bottom right 0.0f, 0.0f, // 2 top left 1.0f, 0.0f // 3 top right }; - private static final float RECTANGLE_TEX_COORDS1[] = { + private static final float[] RECTANGLE_TEX_COORDS1 = { 0.0f, 0.0f, // 0 bottom left 1.0f, 0.0f, // 1 bottom right 0.0f, 1.0f, // 2 top left @@ -103,33 +109,31 @@ public class Drawable2d { * The texture coordinates are Y-inverted relative to RECTANGLE. (This seems to work out * right with external textures from SurfaceTexture.) */ - private static final float FULL_RECTANGLE_COORDS[] = { - -1.0f, -1.0f, // 0 bottom left - 1.0f, -1.0f, // 1 bottom right - -1.0f, 1.0f, // 2 top left - 1.0f, 1.0f, // 3 top right + private static final float[] FULL_RECTANGLE_COORDS = { + -1.0f, -1.0f, // 0 bottom left + 1.0f, -1.0f, // 1 bottom right + -1.0f, 1.0f, // 2 top left + 1.0f, 1.0f, // 3 top right }; - /** {zh} - * FrameBuffer 涓庡睆骞曠殑鍧愭爣绯绘槸鍨傜洿闀滃儚鐨勶紝鎵浠ュ湪灏嗙汗鐞嗙粯鍒跺埌涓涓 FrameBuffer 鎴栧睆骞曚笂 - * 鐨勬椂鍊欙紝浠栦滑鐢ㄧ殑绾圭悊椤剁偣鍧愭爣鏄笉鍚岀殑锛岄渶瑕佹敞鎰忋 - */ - /** {en} + + /** * The coordinate system of the FrameBuffer and the screen is mirrored vertically, so when drawing the texture to a FrameBuffer or screen * , the vertex coordinates of the texture they use are different, which needs attention. + *

*/ - - private static final float FULL_RECTANGLE_TEX_COORDS[] = { - 0.0f, 1.0f, // 0 bottom left - 1.0f, 1.0f, // 1 bottom right - 0.0f, 0.0f, // 2 top left - 1.0f, 0.0f // 3 top right + private static final float[] FULL_RECTANGLE_TEX_COORDS = { + 0.0f, 1.0f, // 0 bottom left + 1.0f, 1.0f, // 1 bottom right + 0.0f, 0.0f, // 2 top left + 1.0f, 0.0f // 3 top right }; - private static final float FULL_RECTANGLE_TEX_COORDS1[] = { - 0.0f, 0.0f, // 0 bottom left - 1.0f, 0.0f, // 1 bottom right - 0.0f, 1.0f, // 2 top left - 1.0f, 1.0f // 3 top right + private static final float[] FULL_RECTANGLE_TEX_COORDS1 = { + 0.0f, 0.0f, // 0 bottom left + 1.0f, 0.0f, // 1 bottom right + 0.0f, 1.0f, // 2 top left + 1.0f, 1.0f // 3 top right }; private static final FloatBuffer FULL_RECTANGLE_BUF = GlUtil.createFloatBuffer(FULL_RECTANGLE_COORDS); @@ -152,13 +156,26 @@ public class Drawable2d { * Enum values for constructor. */ public enum Prefab { - TRIANGLE, RECTANGLE, FULL_RECTANGLE + /** + * Triangle prefab. + */ + TRIANGLE, + /** + * Rectangle prefab. + */ + RECTANGLE, + /** + * Full rectangle prefab. + */ + FULL_RECTANGLE } /** * Prepares a drawable from a "pre-fabricated" shape definition. *

    * Does no EGL/GL operations, so this can be done at any time. + * + * @param shape the shape */ public Drawable2d(Prefab shape) { switch (shape) { @@ -197,6 +214,8 @@ public Drawable2d(Prefab shape) { * Returns the array of vertices. *

    * To avoid allocations, this returns internal state. The caller must not modify it. + * + * @return the vertex array */ public FloatBuffer getVertexArray() { return mVertexArray; @@ -206,24 +225,27 @@ public FloatBuffer getVertexArray() { * Returns the array of texture coordinates. *

    * To avoid allocations, this returns internal state. The caller must not modify it. + * + * @return the tex coord array */ public FloatBuffer getTexCoordArray() { return mTexCoordArray; } - /** {zh} - * @brief 杩斿洖 frameBuffer 缁樺埗鐢 texture coordinates - */ - /** {en} - * @brief Returns texture coordinates for drawing frameBuffer - */ + /** + * Gets tex coor array fb. + * + * @return the tex coor array fb + */ public FloatBuffer getTexCoorArrayFB() { return mTexCoordArrayFB; } /** * Returns the number of vertices stored in the vertex array. + * + * @return the vertex count */ public int getVertexCount() { return mVertexCount; @@ -231,6 +253,8 @@ public int getVertexCount() { /** * Returns the width, in bytes, of the data for each vertex. + * + * @return the vertex stride */ public int getVertexStride() { return mVertexStride; @@ -238,6 +262,8 @@ public int getVertexStride() { /** * Returns the width, in bytes, of the data for each texture coordinate. + * + * @return the tex coord stride */ public int getTexCoordStride() { return mTexCoordStride; @@ -245,20 +271,37 @@ public int getTexCoordStride() { /** * Returns the number of position coordinates per vertex. This will be 2 or 3. + * + * @return the coords per vertex */ public int getCoordsPerVertex() { return mCoordsPerVertex; } - public void updateVertexArray(float[] FULL_RECTANGLE_COORDS) { - mVertexArray = GlUtil.createFloatBuffer(FULL_RECTANGLE_COORDS); - mVertexCount = FULL_RECTANGLE_COORDS.length / COORDS_PER_VERTEX; + /** + * Update vertex array. + * + * @param fullRectangleCoords the full rectangle coords + */ + public void updateVertexArray(float[] fullRectangleCoords) { + mVertexArray = GlUtil.createFloatBuffer(fullRectangleCoords); + mVertexCount = fullRectangleCoords.length / COORDS_PER_VERTEX; } - public void updateTexCoordArray(float[] FULL_RECTANGLE_TEX_COORDS) { - mTexCoordArray = GlUtil.createFloatBuffer(FULL_RECTANGLE_TEX_COORDS); + /** + * Update tex coord array. 
+ * + * @param fullRectangleTexCoords the full rectangle tex coords + */ + public void updateTexCoordArray(float[] fullRectangleTexCoords) { + mTexCoordArray = GlUtil.createFloatBuffer(fullRectangleTexCoords); } + /** + * Update tex coord array fb. + * + * @param coords the coords + */ public void updateTexCoordArrayFB(float[] coords) { mTexCoordArrayFB = GlUtil.createFloatBuffer(coords); } diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/Extensions.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/Extensions.java index 1b90c1b7c..2d43b82b7 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/Extensions.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/Extensions.java @@ -30,8 +30,17 @@ import java.io.IOException; import java.io.InputStream; +/** + * The type Extensions. + */ public abstract class Extensions { + /** + * Get bytes byte [ ]. + * + * @param inputStream the input stream + * @return the byte [ ] + */ public static byte[] getBytes(InputStream inputStream) { try { byte[] bytes = new byte[inputStream.available()]; @@ -45,6 +54,13 @@ public static byte[] getBytes(InputStream inputStream) { return new byte[0]; } + /** + * Get bytes byte [ ]. + * + * @param assetManager the asset manager + * @param fileName the file name + * @return the byte [ ] + */ public static byte[] getBytes(AssetManager assetManager, String fileName) { try { return getBytes(assetManager.open(fileName)); @@ -55,6 +71,13 @@ public static byte[] getBytes(AssetManager assetManager, String fileName) { return new byte[0]; } + /** + * Read text file from resource string. 
+ * + * @param context the context + * @param resourceId the resource id + * @return the string + */ public static String readTextFileFromResource(Context context, int resourceId) { return new String(Extensions.getBytes(context.getResources().openRawResource(resourceId))); } diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/GlUtil.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/GlUtil.java index 751e87e99..3b0574d54 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/GlUtil.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/GlUtil.java @@ -48,8 +48,14 @@ * Some OpenGL utility functions. */ public abstract class GlUtil { + /** + * The constant TAG. + */ public static final String TAG = GlUtil.class.getSimpleName(); + /** + * The constant NO_TEXTURE. + */ public static final int NO_TEXTURE = -1; // public static final int TYPE_FITXY=0; // public static final int TYPE_CENTERCROP=1; @@ -57,8 +63,14 @@ public abstract class GlUtil { // public static final int TYPE_FITSTART=3; // public static final int TYPE_FITEND=4; - public static float x_scale = 1.0f; - public static float y_scale = 1.0f; + /** + * The constant x_scale. + */ + public static final float X_SCALE = 1.0f; + /** + * The constant y_scale. + */ + public static final float Y_SCALE = 1.0f; /** * Identity matrix for general use. Don't modify or life will get weird. @@ -79,6 +91,8 @@ private GlUtil() { /** * Creates a new program from the supplied vertex and fragment shaders. * + * @param vertexSource the vertex source + * @param fragmentSource the fragment source * @return A handle to the program, or 0 on failure. */ public static int createProgram(String vertexSource, String fragmentSource) { @@ -115,6 +129,8 @@ public static int createProgram(String vertexSource, String fragmentSource) { /** * Compiles the provided shader source. 
* + * @param shaderType the shader type + * @param source the source * @return A handle to the shader, or 0 on failure. */ public static int loadShader(int shaderType, String source) { @@ -135,6 +151,8 @@ public static int loadShader(int shaderType, String source) { /** * Checks to see if a GLES error has been raised. + * + * @param op the op */ public static void checkGlError(String op) { int error = GLES20.glGetError(); @@ -149,6 +167,9 @@ public static void checkGlError(String op) { * could not be found, but does not set the GL error. *

    * Throws a RuntimeException if the location is invalid. + * + * @param location the location + * @param label the label */ public static void checkLocation(int location, String label) { if (location < 0) { @@ -157,7 +178,6 @@ public static void checkLocation(int location, String label) { } - /** * Creates a texture from raw data. * @@ -201,7 +221,9 @@ public static int createImageTexture(ByteBuffer data, int width, int height, int * @return Handle to texture. */ public static int createImageTexture(Bitmap bmp) { - if (null == bmp || bmp.isRecycled())return NO_TEXTURE; + if (null == bmp || bmp.isRecycled()) { + return NO_TEXTURE; + } int[] textureHandles = new int[1]; int textureHandle; GLES20.glGenTextures(1, textureHandles, 0); @@ -232,6 +254,9 @@ public static int createImageTexture(Bitmap bmp) { /** * Allocates a direct float buffer, and populates it with the float array data. + * + * @param coords the coords + * @return the float buffer */ public static FloatBuffer createFloatBuffer(float[] coords) { // Allocate a direct ByteBuffer, using 4 bytes per float, and copy coords into it. @@ -243,6 +268,15 @@ public static FloatBuffer createFloatBuffer(float[] coords) { return fb; } + /** + * Change mvp matrix crop float [ ]. + * + * @param viewWidth the view width + * @param viewHeight the view height + * @param textureWidth the texture width + * @param textureHeight the texture height + * @return the float [ ] + */ public static float[] changeMVPMatrixCrop(float viewWidth, float viewHeight, float textureWidth, float textureHeight) { float scale = viewWidth * textureHeight / viewHeight / textureWidth; float[] mvp = new float[16]; @@ -255,6 +289,9 @@ public static float[] changeMVPMatrixCrop(float viewWidth, float viewHeight, flo * Creates a texture object suitable for use with this program. *

    * On exit, the texture will be bound. + * + * @param textureTarget the texture target + * @return the int */ public static int createTextureObject(int textureTarget) { int[] textures = new int[1]; @@ -274,18 +311,37 @@ public static int createTextureObject(int textureTarget) { return texId; } + /** + * Delete texture id. + * + * @param textureId the texture id + */ public static void deleteTextureId(int[] textureId) { if (textureId != null && textureId.length > 0) { GLES20.glDeleteTextures(textureId.length, textureId, 0); } } + + /** + * Delete texture id. + * + * @param textureId the texture id + */ public static void deleteTextureId(int textureId) { int[] textures = new int[1]; - textures[0]= textureId; + textures[0] = textureId; GLES20.glDeleteTextures(textures.length, textures, 0); } + /** + * Create fbo. + * + * @param fboTex the fbo tex + * @param fboId the fbo id + * @param width the width + * @param height the height + */ public static void createFBO(int[] fboTex, int[] fboId, int width, int height) { //generate fbo id GLES20.glGenFramebuffers(1, fboId, 0); @@ -309,12 +365,27 @@ public static void createFBO(int[] fboTex, int[] fboId, int width, int height) { GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0); } + /** + * Delete fbo. + * + * @param fboId the fbo id + */ public static void deleteFBO(int[] fboId) { if (fboId != null && fboId.length > 0) { GLES20.glDeleteFramebuffers(fboId.length, fboId, 0); } } + /** + * Change mvp matrix crop float [ ]. 
+ * + * @param mvpMatrix the mvp matrix + * @param viewWidth the view width + * @param viewHeight the view height + * @param textureWidth the texture width + * @param textureHeight the texture height + * @return the float [ ] + */ public static float[] changeMVPMatrixCrop(float[] mvpMatrix, float viewWidth, float viewHeight, float textureWidth, float textureHeight) { float scale = viewWidth * textureHeight / viewHeight / textureWidth; if (scale == 1.0f) { @@ -330,74 +401,106 @@ public static float[] changeMVPMatrixCrop(float[] mvpMatrix, float viewWidth, fl } - public static void getShowMatrix(float[] matrix,int imgWidth,int imgHeight,int viewWidth,int viewHeight){ - if(imgHeight>0&&imgWidth>0&&viewWidth>0&&viewHeight>0){ - float sWhView=(float)viewWidth/viewHeight; - float sWhImg=(float)imgWidth/imgHeight; - float[] projection=new float[16]; - float[] camera=new float[16]; - if(sWhImg>sWhView){ - Matrix.orthoM(projection,0,-sWhView/sWhImg,sWhView/sWhImg,-1,1,1,3); - }else{ - Matrix.orthoM(projection,0,-1,1,-sWhImg/sWhView,sWhImg/sWhView,1,3); + /** + * Gets show matrix. 
+ * + * @param matrix the matrix + * @param imgWidth the img width + * @param imgHeight the img height + * @param viewWidth the view width + * @param viewHeight the view height + */ + public static void getShowMatrix(float[] matrix, int imgWidth, int imgHeight, int viewWidth, int viewHeight) { + if (imgHeight > 0 && imgWidth > 0 && viewWidth > 0 && viewHeight > 0) { + float sWhView = (float) viewWidth / viewHeight; + float sWhImg = (float) imgWidth / imgHeight; + float[] projection = new float[16]; + float[] camera = new float[16]; + if (sWhImg > sWhView) { + Matrix.orthoM(projection, 0, -sWhView / sWhImg, sWhView / sWhImg, -1, 1, 1, 3); + } else { + Matrix.orthoM(projection, 0, -1, 1, -sWhImg / sWhView, sWhImg / sWhView, 1, 3); } - Matrix.setLookAtM(camera,0,0,0,1,0,0,0,0,1,0); - Matrix.multiplyMM(matrix,0,projection,0,camera,0); + Matrix.setLookAtM(camera, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0); + Matrix.multiplyMM(matrix, 0, projection, 0, camera, 0); } } + /** + * Gets show matrix. + * + * @param matrix the matrix + * @param type the type + * @param imgWidth the img width + * @param imgHeight the img height + * @param viewWidth the view width + * @param viewHeight the view height + */ public static void getShowMatrix(float[] matrix, ImageView.ScaleType type, int imgWidth, int imgHeight, int viewWidth, - int viewHeight){ - if(imgHeight>0&&imgWidth>0&&viewWidth>0&&viewHeight>0){ - float[] projection=new float[16]; - float[] camera=new float[16]; - if(type== FIT_XY){ - Matrix.orthoM(projection,0,-1,1,-1,1,1,3); - Matrix.setLookAtM(camera,0,0,0,1,0,0,0,0,1,0); - Matrix.multiplyMM(matrix,0,projection,0,camera,0); + int viewHeight) { + if (imgHeight > 0 && imgWidth > 0 && viewWidth > 0 && viewHeight > 0) { + float[] projection = new float[16]; + float[] camera = new float[16]; + if (type == FIT_XY) { + Matrix.orthoM(projection, 0, -1, 1, -1, 1, 1, 3); + Matrix.setLookAtM(camera, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0); + Matrix.multiplyMM(matrix, 0, projection, 0, camera, 0); } - float 
sWhView=(float)viewWidth/viewHeight; - float sWhImg=(float)imgWidth/imgHeight; - if(sWhImg>sWhView){ - switch (type){ + float sWhView = (float) viewWidth / viewHeight; + float sWhImg = (float) imgWidth / imgHeight; + if (sWhImg > sWhView) { + switch (type) { case CENTER_CROP: - Matrix.orthoM(projection,0,-sWhView/sWhImg,sWhView/sWhImg,-1,1,1,3); - Matrix.scaleM(projection,0,x_scale,y_scale,1); + Matrix.orthoM(projection, 0, -sWhView / sWhImg, sWhView / sWhImg, -1, 1, 1, 3); + Matrix.scaleM(projection, 0, X_SCALE, Y_SCALE, 1); break; case CENTER_INSIDE: - Matrix.orthoM(projection,0,-1,1,-sWhImg/sWhView,sWhImg/sWhView,1,3); + Matrix.orthoM(projection, 0, -1, 1, -sWhImg / sWhView, sWhImg / sWhView, 1, 3); break; case FIT_START: - Matrix.orthoM(projection,0,-1,1,1-2*sWhImg/sWhView,1,1,3); + Matrix.orthoM(projection, 0, -1, 1, 1 - 2 * sWhImg / sWhView, 1, 1, 3); break; case FIT_END: - Matrix.orthoM(projection,0,-1,1,-1,2*sWhImg/sWhView-1,1,3); + Matrix.orthoM(projection, 0, -1, 1, -1, 2 * sWhImg / sWhView - 1, 1, 3); break; + default: + // do nothing } - }else{ - switch (type){ + } else { + switch (type) { case CENTER_CROP: - Matrix.orthoM(projection,0,-1,1,-sWhImg/sWhView,sWhImg/sWhView,1,3); - Matrix.scaleM(projection,0,x_scale,y_scale,1); + Matrix.orthoM(projection, 0, -1, 1, -sWhImg / sWhView, sWhImg / sWhView, 1, 3); + Matrix.scaleM(projection, 0, X_SCALE, Y_SCALE, 1); break; case CENTER_INSIDE: - Matrix.orthoM(projection,0,-sWhView/sWhImg,sWhView/sWhImg,-1,1,1,3); + Matrix.orthoM(projection, 0, -sWhView / sWhImg, sWhView / sWhImg, -1, 1, 1, 3); break; case FIT_START: - Matrix.orthoM(projection,0,-1,2*sWhView/sWhImg-1,-1,1,1,3); + Matrix.orthoM(projection, 0, -1, 2 * sWhView / sWhImg - 1, -1, 1, 1, 3); break; case FIT_END: - Matrix.orthoM(projection,0,1-2*sWhView/sWhImg,1,-1,1,1,3); + Matrix.orthoM(projection, 0, 1 - 2 * sWhView / sWhImg, 1, -1, 1, 1, 3); break; + default: + // do nothing } } - Matrix.setLookAtM(camera,0,0,0,1,0,0,0,0,1,0); - 
Matrix.multiplyMM(matrix,0,projection,0,camera,0); + Matrix.setLookAtM(camera, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0); + Matrix.multiplyMM(matrix, 0, projection, 0, camera, 0); } } + /** + * Change mvp matrix inside float [ ]. + * + * @param viewWidth the view width + * @param viewHeight the view height + * @param textureWidth the texture width + * @param textureHeight the texture height + * @return the float [ ] + */ public static float[] changeMVPMatrixInside(float viewWidth, float viewHeight, float textureWidth, float textureHeight) { float scale = viewWidth * textureHeight / viewHeight / textureWidth; float[] mvp = new float[16]; @@ -409,8 +512,8 @@ public static float[] changeMVPMatrixInside(float viewWidth, float viewHeight, f /** * Prefer OpenGL ES 3.0, otherwise 2.0 * - * @param context - * @return + * @param context the context + * @return support gl version */ public static int getSupportGLVersion(Context context) { final ActivityManager activityManager = (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE); @@ -423,24 +526,55 @@ public static int getSupportGLVersion(Context context) { } - public static float[] rotate(float[] m,float angle){ - Matrix.rotateM(m,0,angle,0,0,1); + /** + * Rotate float [ ]. + * + * @param m the m + * @param angle the angle + * @return the float [ ] + */ + public static float[] rotate(float[] m, float angle) { + Matrix.rotateM(m, 0, angle, 0, 0, 1); return m; } - public static float[] flip(float[] m,boolean x,boolean y){ - if(x||y){ - Matrix.scaleM(m,0,x?-1:1,y?-1:1,1); + /** + * Flip float [ ]. + * + * @param m the m + * @param x the x + * @param y the y + * @return the float [ ] + */ + public static float[] flip(float[] m, boolean x, boolean y) { + if (x || y) { + Matrix.scaleM(m, 0, x ? -1 : 1, y ? -1 : 1, 1); } return m; } - public static float[] scale(float[] m,float x,float y){ - Matrix.scaleM(m,0,x,y,1); + /** + * Scale float [ ]. 
+ * + * @param m the m + * @param x the x + * @param y the y + * @return the float [ ] + */ + public static float[] scale(float[] m, float x, float y) { + Matrix.scaleM(m, 0, x, y, 1); return m; } + /** + * Read pixles buffer byte buffer. + * + * @param textureId the texture id + * @param width the width + * @param height the height + * @return the byte buffer + */ public static ByteBuffer readPixlesBuffer(int textureId, int width, int height) { if (textureId == GlUtil.NO_TEXTURE) { @@ -478,7 +612,12 @@ public static ByteBuffer readPixlesBuffer(int textureId, int width, int height) return mCaptureBuffer; } - public static int getExternalOESTextureID(){ + /** + * Gets external oes texture id. + * + * @return the external oes texture id + */ + public static int getExternalOESTextureID() { int[] texture = new int[1]; GLES20.glGenTextures(1, texture, 0); diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/Program.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/Program.java index 71571a0c5..3152c5606 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/Program.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/Program.java @@ -31,44 +31,96 @@ import java.nio.ByteBuffer; +/** + * The type Program. + */ public abstract class Program { private static final String TAG = GlUtil.TAG; - // Handles to the GL program and various components of it. + /** + * The M program handle. + */ +// Handles to the GL program and various components of it. protected int mProgramHandle; + /** + * The M drawable 2 d. + */ protected Drawable2d mDrawable2d; + /** + * The M frame buffers. + */ protected int[] mFrameBuffers; + /** + * The M frame buffer textures. + */ protected int[] mFrameBufferTextures; - protected int FRAME_BUFFER_NUM = 1; + /** + * The Frame buffer num. 
+ */ + protected int frameBufferNum = 1; + /** + * The M frame buffer shape. + */ protected Point mFrameBufferShape; + /** * Prepares the program in the current EGL context. + * + * @param vertexShader the vertex shader + * @param fragmentShader2D the fragment shader 2 d */ - public Program(String VERTEX_SHADER, String FRAGMENT_SHADER_2D) { - mProgramHandle = GlUtil.createProgram(VERTEX_SHADER, FRAGMENT_SHADER_2D); + public Program(String vertexShader, String fragmentShader2D) { + mProgramHandle = GlUtil.createProgram(vertexShader, fragmentShader2D); mDrawable2d = getDrawable2d(); getLocations(); } + /** + * Instantiates a new Program. + * + * @param context the context + * @param vertexShaderResourceId the vertex shader resource id + * @param fragmentShaderResourceId the fragment shader resource id + */ public Program(Context context, int vertexShaderResourceId, int fragmentShaderResourceId) { this(Extensions.readTextFileFromResource(context, vertexShaderResourceId), Extensions.readTextFileFromResource(context, fragmentShaderResourceId)); } - public void updateVertexArray(float[] FULL_RECTANGLE_COORDS) { - mDrawable2d.updateVertexArray(FULL_RECTANGLE_COORDS); + /** + * Update vertex array. + * + * @param fullRectangleCoords the full rectangle coords + */ + public void updateVertexArray(float[] fullRectangleCoords) { + mDrawable2d.updateVertexArray(fullRectangleCoords); } - public void updateTexCoordArray(float[] FULL_RECTANGLE_TEX_COORDS) { - mDrawable2d.updateTexCoordArray(FULL_RECTANGLE_TEX_COORDS); + /** + * Update tex coord array. + * + * @param fullRectangleTexCoords the full rectangle tex coords + */ + public void updateTexCoordArray(float[] fullRectangleTexCoords) { + mDrawable2d.updateTexCoordArray(fullRectangleTexCoords); } + /** + * Update tex coord array fb. + * + * @param coords the coords + */ public void updateTexCoordArrayFB(float[] coords) { mDrawable2d.updateTexCoordArrayFB(coords); } + /** + * Gets drawable 2 d. 
+ * + * @return the drawable 2 d + */ protected abstract Drawable2d getDrawable2d(); /** @@ -78,15 +130,42 @@ public void updateTexCoordArrayFB(float[] coords) { /** * Issues the draw call. Does the full setup on every call. + * + * @param textureId the texture id + * @param width the width + * @param height the height + * @param mvpMatrix the mvp matrix */ public abstract void drawFrameOnScreen(int textureId, int width, int height, float[] mvpMatrix); + /** + * Draw frame off screen int. + * + * @param textureId the texture id + * @param width the width + * @param height the height + * @param mvpMatrix the mvp matrix + * @return the int + */ + public abstract int drawFrameOffScreen(int textureId, int width, int height, float[] mvpMatrix); - public abstract int drawFrameOffScreen(int textureId,int width, int height, float[] mvpMatrix); - + /** + * Read buffer byte buffer. + * + * @param textureId the texture id + * @param width the width + * @param height the height + * @return the byte buffer + */ public abstract ByteBuffer readBuffer(int textureId, int width, int height); + /** + * Init frame buffer if need. 
+ * + * @param width the width + * @param height the height + */ protected void initFrameBufferIfNeed(int width, int height) { boolean need = false; if (null == mFrameBufferShape || mFrameBufferShape.x != width || mFrameBufferShape.y != height) { @@ -96,11 +175,11 @@ protected void initFrameBufferIfNeed(int width, int height) { need = true; } if (need) { - mFrameBuffers = new int[FRAME_BUFFER_NUM]; - mFrameBufferTextures = new int[FRAME_BUFFER_NUM]; - GLES20.glGenFramebuffers(FRAME_BUFFER_NUM, mFrameBuffers, 0); - GLES20.glGenTextures(FRAME_BUFFER_NUM, mFrameBufferTextures, 0); - for (int i = 0; i < FRAME_BUFFER_NUM; i++) { + mFrameBuffers = new int[frameBufferNum]; + mFrameBufferTextures = new int[frameBufferNum]; + GLES20.glGenFramebuffers(frameBufferNum, mFrameBuffers, 0); + GLES20.glGenTextures(frameBufferNum, mFrameBufferTextures, 0); + for (int i = 0; i < frameBufferNum; i++) { bindFrameBuffer(mFrameBufferTextures[i], mFrameBuffers[i], width, height); } mFrameBufferShape = new Point(width, height); @@ -111,26 +190,15 @@ protected void initFrameBufferIfNeed(int width, int height) { private void destroyFrameBuffers() { if (mFrameBufferTextures != null) { - GLES20.glDeleteTextures(FRAME_BUFFER_NUM, mFrameBufferTextures, 0); + GLES20.glDeleteTextures(frameBufferNum, mFrameBufferTextures, 0); mFrameBufferTextures = null; } if (mFrameBuffers != null) { - GLES20.glDeleteFramebuffers(FRAME_BUFFER_NUM, mFrameBuffers, 0); + GLES20.glDeleteFramebuffers(frameBufferNum, mFrameBuffers, 0); mFrameBuffers = null; } } - /** {zh} - * Texture parameter setup + buffer binding - * set texture params - * and bind buffer - */ - /** {en} - * Texture parameter setting + buffer binding - * set texture params - * and binding buffer - */ - private void bindFrameBuffer(int textureId, int frameBuffer, int width, int height) { GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId); GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0, diff --git 
a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramManager.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramManager.java index 73e884fd0..d227e6da6 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramManager.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramManager.java @@ -27,19 +27,25 @@ import com.bytedance.labcv.effectsdk.BytedEffectConstants; +/** + * The type Program manager. + */ public class ProgramManager { - public ProgramManager() { - } - private ProgramTexture2d mProgramTexture2D; private ProgramTextureOES mProgramTextureOES; - public Program getProgram(BytedEffectConstants.TextureFormat srcTetxureFormat){ - switch (srcTetxureFormat){ + /** + * Gets program. + * + * @param srcTetxureFormat the src tetxure format + * @return the program + */ + public Program getProgram(BytedEffectConstants.TextureFormat srcTetxureFormat) { + switch (srcTetxureFormat) { case Texure2D: - if (null == mProgramTexture2D){ + if (null == mProgramTexture2D) { mProgramTexture2D = new ProgramTexture2d(); } return mProgramTexture2D; @@ -48,18 +54,21 @@ public Program getProgram(BytedEffectConstants.TextureFormat srcTetxureFormat){ mProgramTextureOES = new ProgramTextureOES(); } return mProgramTextureOES; + default: + return null; } - return null; - } - public void release(){ - if (null != mProgramTexture2D){ + /** + * Release. 
+ */ + public void release() { + if (null != mProgramTexture2D) { mProgramTexture2D.release(); mProgramTexture2D = null; } - if (null != mProgramTextureOES){ + if (null != mProgramTextureOES) { mProgramTextureOES.release(); mProgramTextureOES = null; diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramTexture2d.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramTexture2d.java index b81a0525f..3aab4a67e 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramTexture2d.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramTexture2d.java @@ -32,32 +32,38 @@ import java.nio.ByteBuffer; +/** + * The type Program texture 2 d. + */ public class ProgramTexture2d extends Program { // Simple vertex shader, used for all programs. private static final String VERTEX_SHADER = - "uniform mat4 uMVPMatrix;\n" + - "attribute vec4 aPosition;\n" + - "attribute vec2 aTextureCoord;\n" + - "varying vec2 vTextureCoord;\n" + - "void main() {\n" + - " gl_Position = uMVPMatrix * aPosition;\n" + - " vTextureCoord = aTextureCoord;\n" + - "}\n"; + "uniform mat4 uMVPMatrix;\n" + + "attribute vec4 aPosition;\n" + + "attribute vec2 aTextureCoord;\n" + + "varying vec2 vTextureCoord;\n" + + "void main() {\n" + + " gl_Position = uMVPMatrix * aPosition;\n" + + " vTextureCoord = aTextureCoord;\n" + + "}\n"; // Simple fragment shader for use with "normal" 2D textures. 
private static final String FRAGMENT_SHADER_2D = - "precision mediump float;\n" + - "varying vec2 vTextureCoord;\n" + - "uniform sampler2D sTexture;\n" + - "void main() {\n" + - " gl_FragColor = texture2D(sTexture, vTextureCoord);\n" + - "}\n"; + "precision mediump float;\n" + + "varying vec2 vTextureCoord;\n" + + "uniform sampler2D sTexture;\n" + + "void main() {\n" + + " gl_FragColor = texture2D(sTexture, vTextureCoord);\n" + + "}\n"; private int muMVPMatrixLoc; private int maPositionLoc; private int maTextureCoordLoc; + /** + * Instantiates a new Program texture 2 d. + */ public ProgramTexture2d() { super(VERTEX_SHADER, FRAGMENT_SHADER_2D); } @@ -116,7 +122,6 @@ public void drawFrameOnScreen(int textureId, int width, int height, float[] mvpM GLES20.glViewport(0, 0, width, height); - // Draw the rect. GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, mDrawable2d.getVertexCount()); GlUtil.checkGlError("glDrawArrays"); @@ -187,29 +192,17 @@ public int drawFrameOffScreen(int textureId, int width, int height, float[] mvpM return mFrameBufferTextures[0]; } - /** {zh} - * Read the rendered result buffer - * @param width target width - * @param height target height - * @return pixel buffer of the rendered result, RGBA format - */ - /** {en} - * Read the buffer - * @param width target width - * @param height target height - * @return pixel Buffer format of the rendered result RGBA - */ - private int mWidth = 0; private int mHeight = 0; private ByteBuffer mCaptureBuffer = null; + @Override public ByteBuffer readBuffer(int textureId, int width, int height) { - if ( textureId == GlUtil.NO_TEXTURE) { + if (textureId == GlUtil.NO_TEXTURE) { return null; } - if (width* height == 0){ - return null; + if (width * height == 0) { + return null; } if (mCaptureBuffer == null || mWidth * mHeight != width * height) { @@ -219,7 +212,7 @@ public ByteBuffer readBuffer(int textureId, int width, int height) { } mCaptureBuffer.position(0); int[] frameBuffer = new int[1]; - GLES20.glGenFramebuffers(1,frameBuffer,0); + 
GLES20.glGenFramebuffers(1, frameBuffer, 0); GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId); GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR); diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramTextureOES.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramTextureOES.java index c2667f4e7..56fd0f840 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramTextureOES.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramTextureOES.java @@ -32,29 +32,32 @@ import java.nio.ByteBuffer; +/** + * The type Program texture oes. + */ public class ProgramTextureOES extends Program { // Simple vertex shader, used for all programs. private static final String VERTEX_SHADER = - "uniform mat4 uMVPMatrix;\n" + - "attribute vec4 aPosition;\n" + - "attribute vec2 aTextureCoord;\n" + - "varying vec2 vTextureCoord;\n" + - "void main() {\n" + - " gl_Position = uMVPMatrix * aPosition;\n" + - " vTextureCoord = aTextureCoord;\n" + - "}\n"; + "uniform mat4 uMVPMatrix;\n" + + "attribute vec4 aPosition;\n" + + "attribute vec2 aTextureCoord;\n" + + "varying vec2 vTextureCoord;\n" + + "void main() {\n" + + " gl_Position = uMVPMatrix * aPosition;\n" + + " vTextureCoord = aTextureCoord;\n" + + "}\n"; // Simple fragment shader for use with external 2D textures (e.g. what we get from // SurfaceTexture). 
private static final String FRAGMENT_SHADER_EXT = - "#extension GL_OES_EGL_image_external : require\n" + - "precision mediump float;\n" + - "varying vec2 vTextureCoord;\n" + - "uniform samplerExternalOES sTexture;\n" + - "void main() {\n" + - " gl_FragColor = texture2D(sTexture, vTextureCoord);\n" + - "}\n"; + "#extension GL_OES_EGL_image_external : require\n" + + "precision mediump float;\n" + + "varying vec2 vTextureCoord;\n" + + "uniform samplerExternalOES sTexture;\n" + + "void main() {\n" + + " gl_FragColor = texture2D(sTexture, vTextureCoord);\n" + + "}\n"; private int muMVPMatrixLoc; private int maPositionLoc; @@ -83,7 +86,7 @@ protected void getLocations() { } @Override - public void drawFrameOnScreen(int textureId,int width, int height, float[] mvpMatrix) { + public void drawFrameOnScreen(int textureId, int width, int height, float[] mvpMatrix) { GlUtil.checkGlError("draw start"); // Select the program. @@ -156,7 +159,6 @@ public int drawFrameOffScreen(int textureId, int width, int height, float[] mvpM GlUtil.checkGlError("glUniformMatrix4fv"); - // Enable the "aPosition" vertex attribute. 
GLES20.glEnableVertexAttribArray(maPositionLoc); GlUtil.checkGlError("glEnableVertexAttribArray"); @@ -192,33 +194,28 @@ public int drawFrameOffScreen(int textureId, int width, int height, float[] mvpM } - /** {zh} - * Read the rendered result buffer - * @param width target width - * @param height target height - * @return pixel buffer of the rendered result, RGBA format - */ - /** {en} + /** + * {en} * Read the buffer - * @param width target width + * + * @param width target width * @param height target height * @return pixel Buffer format of the rendered result RGBA */ - @Override public ByteBuffer readBuffer(int textureId, int width, int height) { - if ( textureId == GlUtil.NO_TEXTURE) { + if (textureId == GlUtil.NO_TEXTURE) { return null; } - if (width* height == 0){ - return null; + if (width * height == 0) { + return null; } - ByteBuffer mCaptureBuffer = ByteBuffer.allocateDirect(width* height*4); + ByteBuffer mCaptureBuffer = ByteBuffer.allocateDirect(width * height * 4); mCaptureBuffer.position(0); int[] frameBuffer = new int[1]; - GLES20.glGenFramebuffers(1,frameBuffer,0); + GLES20.glGenFramebuffers(1, frameBuffer, 0); GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId); GLES20.glTexParameterf(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR); diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramTextureYUV.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramTextureYUV.java index 14a992368..8b68d35ef 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramTextureYUV.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/bytedance/utils/opengl/ProgramTextureYUV.java @@ -41,6 +41,9 @@ public class ProgramTextureYUV extends Program { private int mVTextureLoc; private int mVUTextureLoc; + /** + * Instantiates a new Program texture yuv. 
+ */ public ProgramTextureYUV() { super(VERTEX, FRAGMENT); } @@ -69,6 +72,17 @@ protected void getLocations() { GlUtil.checkLocation(muMVPMatrixLoc, "vuTexture"); } + /** + * Draw frame off screen int. + * + * @param yTexture the y texture + * @param uTexture the u texture + * @param vTexture the v texture + * @param width the width + * @param height the height + * @param mvpMatrix the mvp matrix + * @return the int + */ public int drawFrameOffScreen(int yTexture, int uTexture, int vTexture, int width, int height, float[] mvpMatrix) { GlUtil.checkGlError("draw start"); @@ -124,6 +138,16 @@ public int drawFrameOffScreen(int yTexture, int uTexture, int vTexture, int widt return mFrameBufferTextures[0]; } + /** + * Draw frame off screen int. + * + * @param yTexture the y texture + * @param vuTexture the vu texture + * @param width the width + * @param height the height + * @param mvpMatrix the mvp matrix + * @return the int + */ public int drawFrameOffScreen(int yTexture, int vuTexture, int width, int height, float[] mvpMatrix) { GlUtil.checkGlError("draw start"); @@ -188,33 +212,39 @@ public ByteBuffer readBuffer(int textureId, int width, int height) { return null; } - public static final String VERTEX = "uniform mat4 uMVPMatrix;\n" + - "attribute vec4 aPosition;\n" + - "attribute vec2 aTextureCoord;\n" + - "varying vec2 vTextureCoord;\n" + - "void main() {\n" + - " gl_Position = uMVPMatrix * aPosition;\n" + - " vTextureCoord = aTextureCoord;\n" + - "}\n"; - public static final String FRAGMENT = "varying highp vec2 vTextureCoord;\n" + - " uniform sampler2D yTexture;\n" + - " uniform sampler2D vuTexture;\n" + - " uniform sampler2D uTexture;\n" + - " uniform sampler2D vTexture;\n" + - " void main()\n" + - " {\n" + - " mediump vec3 yuv;\n" + - " lowp vec3 rgb;\n" + - " yuv.x = texture2D(yTexture, vTextureCoord).a - 0.065;\n" + - " yuv.y = texture2D(vuTexture, vTextureCoord).a - 0.5;\n" + - " yuv.z = texture2D(vuTexture, vTextureCoord).r - 0.5;\n" + -// " rgb = mat3( 1, 
1, 1,\n" + -// " 0, -.21482, 2.12798,\n" + -// " 1.28033, -.38059, 0) * yuv;\n" + - " rgb.x = yuv.x + 1.4075 * yuv.z;\n" + - " rgb.y = yuv.x - 0.3455 * yuv.y - 0.7169 * yuv.z;\n" + - " rgb.z = yuv.x + 1.779 * yuv.y;\n" + -// " gl_FragColor = vec4(rgb.x, rgb.y, rgb.z, 1);\n" + - " gl_FragColor = vec4(rgb.x, rgb.y, rgb.z, 1);\n" + - " }"; + /** + * The constant VERTEX. + */ + public static final String VERTEX = "uniform mat4 uMVPMatrix;\n" + + "attribute vec4 aPosition;\n" + + "attribute vec2 aTextureCoord;\n" + + "varying vec2 vTextureCoord;\n" + + "void main() {\n" + + " gl_Position = uMVPMatrix * aPosition;\n" + + " vTextureCoord = aTextureCoord;\n" + + "}\n"; + /** + * The constant FRAGMENT. + */ + public static final String FRAGMENT = "varying highp vec2 vTextureCoord;\n" + + " uniform sampler2D yTexture;\n" + + " uniform sampler2D vuTexture;\n" + + " uniform sampler2D uTexture;\n" + + " uniform sampler2D vTexture;\n" + + " void main()\n" + + " {\n" + + " mediump vec3 yuv;\n" + + " lowp vec3 rgb;\n" + + " yuv.x = texture2D(yTexture, vTextureCoord).a - 0.065;\n" + + " yuv.y = texture2D(vuTexture, vTextureCoord).a - 0.5;\n" + + " yuv.z = texture2D(vuTexture, vTextureCoord).r - 0.5;\n" +// + " rgb = mat3( 1, 1, 1,\n" +// + " 0, -.21482, 2.12798,\n" +// + " 1.28033, -.38059, 0) * yuv;\n" + + " rgb.x = yuv.x + 1.4075 * yuv.z;\n" + + " rgb.y = yuv.x - 0.3455 * yuv.y - 0.7169 * yuv.z;\n" + + " rgb.z = yuv.x + 1.779 * yuv.y;\n" +// + " gl_FragColor = vec4(rgb.x, rgb.y, rgb.z, 1);\n" + + " gl_FragColor = vec4(rgb.x, rgb.y, rgb.z, 1);\n" + + " }"; } diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/FaceUnityBeautyAPI.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/FaceUnityBeautyAPI.kt index 09e7db8db..f7a07a85e 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/FaceUnityBeautyAPI.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/FaceUnityBeautyAPI.kt @@ -31,13 +31,37 
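The commented-out matrix and the per-channel lines in the FRAGMENT shader above implement a fixed YUV-to-RGB conversion (BT.601-style coefficients with a 0.065 luma offset). As a sanity check, the same per-pixel math can be reproduced on the CPU; the class and method names below are hypothetical and not part of the patch:

```java
public class YuvToRgb {
    // Same per-pixel math as the FRAGMENT shader above; all inputs are
    // normalized to [0, 1], matching GLSL texture sampling.
    public static float[] toRgb(float y, float u, float v) {
        float yy = y - 0.065f; // luma offset used by the shader
        float uu = u - 0.5f;   // chroma centered around 0
        float vv = v - 0.5f;
        return new float[] {
            yy + 1.4075f * vv,                // R
            yy - 0.3455f * uu - 0.7169f * vv, // G
            yy + 1.779f * uu                  // B
        };
    }

    public static void main(String[] args) {
        // Neutral chroma (u = v = 0.5) must produce a gray pixel: R == G == B.
        float[] gray = toRgb(0.565f, 0.5f, 0.5f);
        System.out.printf("R=%.3f G=%.3f B=%.3f%n", gray[0], gray[1], gray[2]);
    }
}
```

Feeding neutral chroma and checking for a gray result is a quick way to verify the coefficients were transcribed correctly from the shader.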
@@ import io.agora.base.VideoFrame import io.agora.rtc2.Constants import io.agora.rtc2.RtcEngine +/** + * Version + */ const val VERSION = "1.0.3" +/** + * Capture mode + * + * @constructor Create empty Capture mode + */ enum class CaptureMode{ + /** + * Agora + * + * @constructor Create empty Agora + */ Agora, // Process frames through Agora's internal raw video data interface + + /** + * Custom + * + * @constructor Create empty Custom + */ Custom // Custom mode: call onFrame yourself to pass raw video frames to the BeautyAPI for processing } +/** + * I event callback + * + * @constructor Create empty I event callback + */ interface IEventCallback{ /** @@ -48,27 +72,83 @@ interface IEventCallback{ fun onBeautyStats(stats: BeautyStats) } +/** + * Beauty stats + * + * @property minCostMs + * @property maxCostMs + * @property averageCostMs + * @constructor Create empty Beauty stats + */ data class BeautyStats( val minCostMs:Long, // minimum cost within the stats window val maxCostMs: Long, // maximum cost within the stats window val averageCostMs: Long // average cost within the stats window ) +/** + * Mirror mode + * + * @constructor Create empty Mirror mode + */ enum class MirrorMode { // Definition of a normal, unmirrored image: with the front camera the captured image and the phone preview are left-right reversed; with the back camera they match + /** + * Mirror Local Remote + * + * @constructor Create empty Mirror Local Remote + */ MIRROR_LOCAL_REMOTE, // Mirror both local and remote (front-camera default); local and remote stickers both look correct + + /** + * Mirror Local Only + * + * @constructor Create empty Mirror Local Only + */ MIRROR_LOCAL_ONLY, // Mirror local only; remote is unmirrored, so remote stickers look correct and local stickers are mirrored. For calling and e-commerce live streaming (keeps signage text behind the host readable on the remote side); since local and remote are opposite, sticker text is inevitably reversed on one side + + /** + * Mirror Remote Only + * + * @constructor Create empty Mirror Remote Only + */ MIRROR_REMOTE_ONLY, // Mirror remote only; local is unmirrored; remote stickers look correct, local stickers are mirrored + + /** + * Mirror None + * + * @constructor Create empty Mirror None + */ MIRROR_NONE // No mirroring locally or remotely (back-camera default); local and remote stickers both look correct } +/** + * Camera config + * + * @property frontMirror + * @property backMirror + * @constructor Create empty Camera config + */ data class CameraConfig( val frontMirror: MirrorMode = 
MirrorMode.MIRROR_LOCAL_REMOTE, // Front camera default: mirror both local and remote val backMirror: MirrorMode = MirrorMode.MIRROR_NONE // Back camera default: no mirroring locally or remotely ) +/** + * Config + * + * @property context + * @property rtcEngine + * @property fuRenderKit + * @property eventCallback + * @property captureMode + * @property statsDuration + * @property statsEnable + * @property cameraConfig + * @constructor Create empty Config + */ data class Config( val context: Context, // Android Context val rtcEngine: RtcEngine, // Agora RTC engine @@ -80,24 +160,103 @@ data class Config( val cameraConfig: CameraConfig = CameraConfig() // Camera mirror configuration ) +/** + * Error code + * + * @property value + * @constructor Create empty Error code + */ enum class ErrorCode(val value: Int) { + /** + * Error Ok + * + * @constructor Create empty Error Ok + */ ERROR_OK(0), // Everything is fine + + /** + * Error Has Not Initialized + * + * @constructor Create empty Error Has Not Initialized + */ ERROR_HAS_NOT_INITIALIZED(101), // Another API was called before initialize, or after initialize failed + + /** + * Error Has Initialized + * + * @constructor Create empty Error Has Initialized + */ ERROR_HAS_INITIALIZED(102), // initialize was called again after it already succeeded + + /** + * Error Has Released + * + * @constructor Create empty Error Has Released + */ ERROR_HAS_RELEASED(103), // Another API was called after release destroyed the instance + + /** + * Error Process Not Custom + * + * @constructor Create empty Error Process Not Custom + */ ERROR_PROCESS_NOT_CUSTOM(104), // onFrame was called with an external video frame outside Custom capture mode + + /** + * Error Process Disable + * + * @constructor Create empty Error Process Disable + */ ERROR_PROCESS_DISABLE(105), // Returned by onFrame after beauty processing was disabled via enable(false) + + /** + * Error View Type Error + * + * @constructor Create empty Error View Type Error + */ ERROR_VIEW_TYPE_ERROR(106), // Returned by setupLocalVideo when the view type is wrong + + /** + * Error Frame Skipped + * + * @constructor Create empty Error Frame Skipped + */ ERROR_FRAME_SKIPPED(107), // Returned by onFrame when the frame is skipped } +/** + * Beauty preset + * + * @constructor 
Create empty Beauty preset + */ enum class BeautyPreset { + /** + * Custom + * + * @constructor Create empty Custom + */ CUSTOM, // Do not use the recommended beauty parameters + + /** + * Default + * + * @constructor Create empty Default + */ DEFAULT // Use the default recommended parameters } +/** + * Create face unity beauty a p i + * + * @return + */ fun createFaceUnityBeautyAPI(): FaceUnityBeautyAPI = FaceUnityBeautyAPIImpl() +/** + * Face unity beauty a p i + * + * @constructor Create empty Face unity beauty a p i + */ interface FaceUnityBeautyAPI { /** @@ -151,6 +310,11 @@ interface FaceUnityBeautyAPI { */ fun isFrontCamera(): Boolean + /** + * Get mirror applied + * + * @return + */ fun getMirrorApplied(): Boolean /** diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/FaceUnityBeautyAPIImpl.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/FaceUnityBeautyAPIImpl.kt index c6f99c4f7..84fb33481 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/FaceUnityBeautyAPIImpl.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/FaceUnityBeautyAPIImpl.kt @@ -58,6 +58,11 @@ import java.io.File import java.nio.ByteBuffer import java.util.concurrent.Callable +/** + * Face unity beauty a p i impl + * + * @constructor Create empty Face unity beauty a p i impl + */ class FaceUnityBeautyAPIImpl : FaceUnityBeautyAPI, IVideoFrameObserver { private val TAG = "FaceUnityBeautyAPIImpl" private val reportId = "scenarioAPI" @@ -79,9 +84,32 @@ class FaceUnityBeautyAPIImpl : FaceUnityBeautyAPI, IVideoFrameObserver { private var statsHelper: StatsHelper? 
= null private var skipFrame = 0 private enum class ProcessSourceType{ + /** + * Unknown + * + * @constructor Create empty Unknown + */ UNKNOWN, + + /** + * Texture Oes Async + * + * @constructor Create empty Texture Oes Async + */ TEXTURE_OES_ASYNC, + + /** + * Texture 2d Async + * + * @constructor Create empty Texture 2d Async + */ TEXTURE_2D_ASYNC, + + /** + * I420 + * + * @constructor Create empty I420 + */ I420 } private var currProcessSourceType = ProcessSourceType.UNKNOWN @@ -90,6 +118,12 @@ class FaceUnityBeautyAPIImpl : FaceUnityBeautyAPI, IVideoFrameObserver { private var cameraConfig = CameraConfig() private var localVideoRenderMode = Constants.RENDER_MODE_HIDDEN + /** + * Initialize + * + * @param config + * @return + */ override fun initialize(config: Config): Int { if (this.config != null) { LogUtils.e(TAG, "initialize >> The beauty api has been initialized!") @@ -119,6 +153,12 @@ class FaceUnityBeautyAPIImpl : FaceUnityBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_OK.value } + /** + * Enable + * + * @param enable + * @return + */ override fun enable(enable: Boolean): Int { LogUtils.i(TAG, "enable >> enable = $enable") if (config == null) { @@ -143,6 +183,13 @@ class FaceUnityBeautyAPIImpl : FaceUnityBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_OK.value } + /** + * Setup local video + * + * @param view + * @param renderMode + * @return + */ override fun setupLocalVideo(view: View, renderMode: Int): Int { val rtcEngine = config?.rtcEngine if(rtcEngine == null){ @@ -161,6 +208,12 @@ class FaceUnityBeautyAPIImpl : FaceUnityBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_VIEW_TYPE_ERROR.value } + /** + * On frame + * + * @param videoFrame + * @return + */ override fun onFrame(videoFrame: VideoFrame): Int { val conf = config if(conf == null){ @@ -185,6 +238,12 @@ class FaceUnityBeautyAPIImpl : FaceUnityBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_FRAME_SKIPPED.value } + /** + * Update camera config + * + * @param 
config + * @return + */ override fun updateCameraConfig(config: CameraConfig): Int { LogUtils.i(TAG, "updateCameraConfig >> oldCameraConfig=$cameraConfig, newCameraConfig=$config") cameraConfig = CameraConfig(config.frontMirror, config.backMirror) @@ -193,14 +252,30 @@ class FaceUnityBeautyAPIImpl : FaceUnityBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_OK.value } + /** + * Is front camera + * + */ override fun isFrontCamera() = isFrontCamera + /** + * Set parameters + * + * @param key + * @param value + */ override fun setParameters(key: String, value: String) { when(key){ "beauty_mode" -> beautyMode = value.toInt() } } + /** + * Set beauty preset + * + * @param preset + * @return + */ override fun setBeautyPreset(preset: BeautyPreset): Int { val conf = config if(conf == null){ @@ -266,6 +341,11 @@ class FaceUnityBeautyAPIImpl : FaceUnityBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_OK.value } + /** + * Release + * + * @return + */ override fun release(): Int { val fuRenderer = config?.fuRenderKit if(fuRenderer == null){ @@ -596,29 +676,75 @@ class FaceUnityBeautyAPIImpl : FaceUnityBeautyAPI, IVideoFrameObserver { // IVideoFrameObserver implements + /** + * On capture video frame + * + * @param sourceType + * @param videoFrame + * @return + */ override fun onCaptureVideoFrame(sourceType: Int, videoFrame: VideoFrame?): Boolean { videoFrame ?: return false return processBeauty(videoFrame) } + /** + * On pre encode video frame + * + * @param sourceType + * @param videoFrame + */ override fun onPreEncodeVideoFrame(sourceType: Int, videoFrame: VideoFrame?) = false + /** + * On media player video frame + * + * @param videoFrame + * @param mediaPlayerId + */ override fun onMediaPlayerVideoFrame(videoFrame: VideoFrame?, mediaPlayerId: Int) = false + /** + * On render video frame + * + * @param channelId + * @param uid + * @param videoFrame + */ override fun onRenderVideoFrame( channelId: String?, uid: Int, videoFrame: VideoFrame? 
) = false + /** + * Get video frame process mode + * + */ override fun getVideoFrameProcessMode() = IVideoFrameObserver.PROCESS_MODE_READ_WRITE + /** + * Get video format preference + * + */ override fun getVideoFormatPreference() = IVideoFrameObserver.VIDEO_PIXEL_DEFAULT + /** + * Get rotation applied + * + */ override fun getRotationApplied() = false + /** + * Get mirror applied + * + */ override fun getMirrorApplied() = captureMirror && !enable + /** + * Get observed frame position + * + */ override fun getObservedFramePosition() = IVideoFrameObserver.POSITION_POST_CAPTURER } \ No newline at end of file diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/FuDeviceUtils.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/FuDeviceUtils.java index 5e03a313c..8e7397963 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/FuDeviceUtils.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/FuDeviceUtils.java @@ -40,12 +40,31 @@ import java.io.InputStream; import java.io.InputStreamReader; -public class FuDeviceUtils { +/** + * The type Fu device utils. + */ +public final class FuDeviceUtils { + + private FuDeviceUtils() { + + } + /** + * The constant TAG. + */ public static final String TAG = "FuDeviceUtils"; + /** + * The constant DEVICE_LEVEL_HIGH. + */ public static final int DEVICE_LEVEL_HIGH = 2; + /** + * The constant DEVICE_LEVEL_MID. + */ public static final int DEVICE_LEVEL_MID = 1; + /** + * The constant DEVICE_LEVEL_LOW. 
+ */ public static final int DEVICE_LEVEL_LOW = 0; /** @@ -148,7 +167,9 @@ public static int getCPUMaxFreqKHz() { try { int freqBound = parseFileForValue("cpu MHz", stream); freqBound *= 1024; //MHz -> kHz - if (freqBound > maxFreq) maxFreq = freqBound; + if (freqBound > maxFreq) { + maxFreq = freqBound; + } } finally { stream.close(); } @@ -245,7 +266,9 @@ private static int parseFileForValue(String textToMatch, FileInputStream stream) int length = stream.read(buffer); for (int i = 0; i < length; i++) { if (buffer[i] == '\n' || i == 0) { - if (buffer[i] == '\n') i++; + if (buffer[i] == '\n') { + i++; + } for (int j = i; j < length; j++) { int textIndex = j - i; //Text doesn't match query at some point. @@ -270,6 +293,7 @@ private static int parseFileForValue(String textToMatch, FileInputStream stream) * Helper method used by {@link #parseFileForValue(String, FileInputStream) parseFileForValue}. Parses * the next available number after the match in the file being read and returns it as an integer. * + * @param buffer Buffer. * @param index - The index in the buffer array to begin looking. * @return The next number on that line in the buffer, returned as an int. Returns * DEVICEINFO_UNKNOWN = -1 in the event that no more numbers exist on the same line. 
@@ -293,8 +317,8 @@ private static int extractValue(byte[] buffer, int index) { /** * Gets the currently available memory (RAM) * - * @param context - * @return + * @param context the context + * @return avail memory */ public static long getAvailMemory(Context context) { ActivityManager am = (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE); @@ -306,7 +330,7 @@ public static long getAvailMemory(Context context) { /** * Gets the manufacturer brand * - * @return + * @return brand */ public static String getBrand() { return Build.BRAND; @@ -315,7 +339,7 @@ public static String getBrand() { /** * Gets the phone model * - * @return + * @return model */ public static String getModel() { return Build.MODEL; @@ -324,7 +348,7 @@ public static String getModel() { /** * Gets hardware info (CPU model) * - * @return + * @return hardware */ public static String getHardWare() { try { @@ -353,13 +377,15 @@ public static String getHardWare() { * Level judgement based on current memory and CPU. * * @param context - Context object. - * @return + * @return int */ public static int judgeDeviceLevel(Context context) { int level; // Some devices do not fit the rules below; fall back to judging by model name int specialDevice = judgeDeviceLevelInDeviceName(); - if (specialDevice >= 0) return specialDevice; + if (specialDevice >= 0) { + return specialDevice; + } int ramLevel = judgeMemory(context); int cpuLevel = judgeCPU(); @@ -372,29 +398,30 @@ public static int judgeDeviceLevel(Context context) { level = DEVICE_LEVEL_MID; } } - LogUtils.d(TAG,"DeviceLevel: " + level); + LogUtils.d(TAG, "DeviceLevel: " + level); return level; } /** * -1 means the device is not a known high/low-end model - * @return + * + * @return level. 
*/ private static int judgeDeviceLevelInDeviceName() { String currentDeviceName = getDeviceName(); - for (String deviceName:upscaleDevice) { + for (String deviceName : UPSCALE_DEVICE) { if (deviceName.equals(currentDeviceName)) { return DEVICE_LEVEL_HIGH; } } - for (String deviceName:middleDevice) { + for (String deviceName : MIDDLE_DEVICES) { if (deviceName.equals(currentDeviceName)) { return DEVICE_LEVEL_MID; } } - for (String deviceName:lowDevice) { + for (String deviceName : LOW_DEVICES) { if (deviceName.equals(currentDeviceName)) { return DEVICE_LEVEL_LOW; } @@ -402,14 +429,24 @@ private static int judgeDeviceLevelInDeviceName() { return -1; } - public static final String[] upscaleDevice = {"vivo X6S A","MHA-AL00","VKY-AL00","V1838A"}; - public static final String[] lowDevice = {}; - public static final String[] middleDevice = {"OPPO R11s","PAR-AL00","MI 8 Lite","ONEPLUS A6000","PRO 6","PRO 7 Plus"}; + /** + * The constant upscaleDevice. + */ + public static final String[] UPSCALE_DEVICE = {"vivo X6S A", "MHA-AL00", "VKY-AL00", "V1838A"}; + /** + * The constant lowDevice. + */ + public static final String[] LOW_DEVICES = {}; + /** + * The constant middleDevice. + */ + public static final String[] MIDDLE_DEVICES = {"OPPO R11s", "PAR-AL00", "MI 8 Lite", "ONEPLUS A6000", "PRO 6", "PRO 7 Plus"}; /** * Rate the memory level. * - * @return + * @param context Context. + * @return level. */ private static int judgeMemory(Context context) { long ramMB = getTotalMemory(context) / (1024 * 1024); @@ -431,7 +468,7 @@ private static int judgeMemory(Context context) { /** * Rate the CPU level (combined judgment based on frequency and vendor model). * - * @return + * @return level. 
*/ private static int judgeCPU() { int level = 0; @@ -445,7 +482,8 @@ private static int judgeCPU() { return judgeQualcommCPU(cpuName, freqMHz); } else if (cpuName.contains("hi") || cpuName.contains("kirin")) { //HiSilicon Kirin return judgeSkinCPU(cpuName, freqMHz); - } else if (cpuName.contains("MT")) {//MediaTek + } else if (cpuName.contains("MT")) { + //MediaTek return judgeMTCPU(cpuName, freqMHz); } } @@ -466,7 +504,9 @@ private static int judgeCPU() { /** * MediaTek chip level judgment * - * @return + * @param cpuName CPU Name. + * @param freqMHz CPU Freq MHz. + * @return level */ private static int judgeMTCPU(String cpuName, int freqMHz) { //Everything before the P60 is low-end MT6771V/C @@ -508,8 +548,8 @@ private static int judgeMTCPU(String cpuName, int freqMHz) { /** * From the MediaTek CPU model definition -> get the CPU version * - * @param cpuName - * @return + * @param cpuName CPU Name. + * @return CPU Version. */ private static int getMTCPUVersion(String cpuName) { //Extract the four digits after "MT" @@ -529,7 +569,9 @@ private static int getMTCPUVersion(String cpuName) { /** * Qualcomm Snapdragon chip level judgment * - * @return + * @param cpuName CPU Name. + * @param freqMHz CPU Freq MHz. + * @return level */ private static int judgeQualcommCPU(String cpuName, int freqMHz) { int level = 0; @@ -561,8 +603,9 @@ private static int judgeQualcommCPU(String cpuName, int freqMHz) { /** * Kirin chip level judgment * - * @param freqMHz - * @return + * @param cpuName CPU Name. + * @param freqMHz CPU Freq MHz. + * @return level */ private static int judgeSkinCPU(String cpuName, int freqMHz) { //Model -> kirin710 or later & max core frequency @@ -590,17 +633,22 @@ private static int judgeSkinCPU(String cpuName, int freqMHz) { return level; } - public static final String Nexus_6P = "Nexus 6P"; + /** + * The constant NEXUS_6P. 
+ */ + public static final String NEXUS_6P = "Nexus 6P"; /** - * Get the device name + * Get the device name. * - * @return + * @return the device name */ public static String getDeviceName() { String deviceName = ""; - if (Build.MODEL != null) deviceName = Build.MODEL; - LogUtils.e(TAG,"deviceName: " + deviceName); + if (Build.MODEL != null) { + deviceName = Build.MODEL; + } + LogUtils.e(TAG, "deviceName: " + deviceName); return deviceName; } } diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/LogUtils.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/LogUtils.kt index 483b8fd49..4722d73a7 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/LogUtils.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/LogUtils.kt @@ -32,6 +32,11 @@ import java.util.Date import java.util.Locale import java.util.concurrent.Executors +/** + * Log utils + * + * @constructor Create empty Log utils + */ object LogUtils { private const val beautyType = "FaceUnity" private val timeFormat = SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS", Locale.ROOT) @@ -39,6 +44,11 @@ object LogUtils { private val workerThread = Executors.newSingleThreadExecutor() private var logOutputStream: FileOutputStream? 
= null + /** + * Set log file path + * + * @param path + */ @JvmStatic fun setLogFilePath(path: String){ if(path.isEmpty()){ @@ -58,6 +68,13 @@ object LogUtils { } + /** + * I + * + * @param tag + * @param content + * @param args + */ @JvmStatic fun i(tag: String, content: String, vararg args: Any) { val consoleMessage = "[BeautyAPI][$beautyType] : ${String.format(content, args)}" @@ -66,6 +83,13 @@ object LogUtils { saveToFile(fileMessage) } + /** + * D + * + * @param tag + * @param content + * @param args + */ @JvmStatic fun d(tag: String, content: String, vararg args: Any) { val consoleMessage = "[BeautyAPI][$beautyType] : ${String.format(content, args)}" @@ -74,6 +98,13 @@ object LogUtils { saveToFile(fileMessage) } + /** + * W + * + * @param tag + * @param content + * @param args + */ @JvmStatic fun w(tag: String, content: String, vararg args: Any){ val consoleMessage = "[BeautyAPI][$beautyType] : ${String.format(content, args)}" @@ -82,6 +113,13 @@ object LogUtils { saveToFile(fileMessage) } + /** + * E + * + * @param tag + * @param content + * @param args + */ @JvmStatic fun e(tag: String, content: String, vararg args: Any){ val consoleMessage = "[BeautyAPI][$beautyType] : ${String.format(content, args)}" diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/StatsHelper.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/StatsHelper.kt index cb4cf1292..6f2dacf46 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/StatsHelper.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/StatsHelper.kt @@ -30,6 +30,13 @@ import io.agora.beautyapi.faceunity.BeautyStats import kotlin.math.max import kotlin.math.min +/** + * Stats helper + * + * @property statsDuration + * @property onStatsChanged + * @constructor Create empty Stats helper + */ class StatsHelper( private val statsDuration: Long, private val onStatsChanged: (BeautyStats) -> Unit @@ -41,6 
+48,11 @@ class StatsHelper( private var mCostMax = 0L private var mCostMin = Long.MAX_VALUE + /** + * Once + * + * @param cost + */ fun once(cost: Long) { val curr = System.currentTimeMillis() if (mStartTime == 0L) { @@ -68,6 +80,10 @@ class StatsHelper( mCostMin = min(mCostMin, cost) } + /** + * Reset + * + */ fun reset() { mMainHandler.removeCallbacksAndMessages(null) mStartTime = 0 diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/EGLContextHelper.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/EGLContextHelper.java index 97b3c7a53..b3717d609 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/EGLContextHelper.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/EGLContextHelper.java @@ -36,6 +36,9 @@ import io.agora.beautyapi.faceunity.utils.LogUtils; +/** + * The type Egl context helper. + */ public class EGLContextHelper { private static final String DEBUG_TAG = "EGLContextManager"; private final int mRedSize = 8; @@ -45,12 +48,23 @@ public class EGLContextHelper { private final int mDepthSize = 16; private final int mStencilSize = 0; private final int mRenderType = 4; - public EGLContextHelper(){} + /** + * Instantiates a new Egl context helper. + */ + public EGLContextHelper() { + } + + /** + * Init egl. 
+ * + * @param shareContext the share context + * @throws Exception the exception + */ public void initEGL(EGLContext shareContext) throws Exception { mEGL = (EGL10) GLDebugHelper.wrap(EGLContext.getEGL(), GLDebugHelper.CONFIG_CHECK_GL_ERROR - | GLDebugHelper.CONFIG_CHECK_THREAD, null); + | GLDebugHelper.CONFIG_CHECK_THREAD, null); if (mEGL == null) { throw new Exception("Couldn't get EGL"); @@ -69,8 +83,8 @@ public void initEGL(EGLContext shareContext) throws Exception { + curGLVersion[1]); int[] num_config = new int[1]; - if(!mEGL.eglChooseConfig(mGLDisplay, mConfigSpec, null, 1, - num_config)){ + if (!mEGL.eglChooseConfig(mGLDisplay, mConfigSpec, null, 1, + num_config)) { throw new IllegalArgumentException("eglChooseConfig failed"); } int numConfigs = num_config[0]; @@ -115,32 +129,75 @@ public void initEGL(EGLContext shareContext) throws Exception { } + /** + * Gets egl context. + * + * @return the egl context + */ public EGLContext getEGLContext() { return mGLContext; } + /** + * Gets gl display. + * + * @return the gl display + */ public EGLDisplay getGLDisplay() { return mGLDisplay; } + /** + * Gets gl config. + * + * @return the gl config + */ public EGLConfig getGLConfig() { return mGLConfig; } + /** + * Gets gl surface. + * + * @return the gl surface + */ public EGLSurface getGLSurface() { return mGLSurface; } + /** + * Gets egl. + * + * @return the egl + */ public EGL10 getEGL() { return mEGL; } + /** + * The M egl. + */ EGL10 mEGL; + /** + * The M gl display. + */ EGLDisplay mGLDisplay; + /** + * The M gl config. + */ EGLConfig mGLConfig; + /** + * The M gl surface. + */ EGLSurface mGLSurface; + /** + * The M gl context. + */ EGLContext mGLContext; + /** + * The M config spec. 
+ */ int[] mConfigSpec = new int[]{ EGL10.EGL_RED_SIZE, mRedSize, EGL10.EGL_GREEN_SIZE, mGreenSize, @@ -148,9 +205,12 @@ public EGL10 getEGL() { EGL10.EGL_ALPHA_SIZE, mAlphaSize, EGL10.EGL_DEPTH_SIZE, mDepthSize, EGL10.EGL_STENCIL_SIZE, mStencilSize, - EGL10.EGL_RENDERABLE_TYPE, mRenderType,//EGL version 2.0 + EGL10.EGL_RENDERABLE_TYPE, mRenderType, //EGL version 2.0 EGL10.EGL_NONE}; + /** + * Release. + */ public void release() { mEGL.eglMakeCurrent(mGLDisplay, EGL10.EGL_NO_SURFACE, EGL10.EGL_NO_SURFACE, EGL10.EGL_NO_CONTEXT); @@ -161,15 +221,25 @@ public void release() { LogUtils.i(DEBUG_TAG, "GL Cleaned up"); } - public boolean eglMakeCurrent(){ - if(mGLContext == EGL10.EGL_NO_CONTEXT){ + /** + * Egl make current boolean. + * + * @return the boolean + */ + public boolean eglMakeCurrent() { + if (mGLContext == EGL10.EGL_NO_CONTEXT) { return false; - }else{ + } else { return mEGL.eglMakeCurrent(mGLDisplay, mGLSurface, mGLSurface, mGLContext); } } - public boolean eglMakeNoCurrent(){ + /** + * Egl make no current boolean. 
+ * + * @return the boolean + */ + public boolean eglMakeNoCurrent() { return mEGL.eglMakeCurrent(mGLDisplay, EGL10.EGL_NO_SURFACE, EGL10.EGL_NO_SURFACE, EGL10.EGL_NO_CONTEXT); } @@ -181,7 +251,7 @@ private EGLConfig chooseConfig(EGL10 egl, EGLDisplay display, EGL10.EGL_DEPTH_SIZE, 0); int s = findConfigAttrib(egl, display, config, EGL10.EGL_STENCIL_SIZE, 0); - if ((d >= mDepthSize) && (s >= mStencilSize)) { + if (d >= mDepthSize && s >= mStencilSize) { int r = findConfigAttrib(egl, display, config, EGL10.EGL_RED_SIZE, 0); int g = findConfigAttrib(egl, display, config, @@ -190,8 +260,8 @@ private EGLConfig chooseConfig(EGL10 egl, EGLDisplay display, EGL10.EGL_BLUE_SIZE, 0); int a = findConfigAttrib(egl, display, config, EGL10.EGL_ALPHA_SIZE, 0); - if ((r == mRedSize) && (g == mGreenSize) - && (b == mBlueSize) && (a == mAlphaSize)) { + if (r == mRedSize && g == mGreenSize + && b == mBlueSize && a == mAlphaSize) { return config; } } diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLCopyHelper.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLCopyHelper.java index b475f39d9..6f92d1474 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLCopyHelper.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLCopyHelper.java @@ -28,31 +28,51 @@ import android.opengl.GLES20; import android.opengl.GLES30; +/** + * The type Gl copy helper. + */ public class GLCopyHelper { private final int bufferCount; - public GLCopyHelper(){ + /** + * Instantiates a new Gl copy helper. + */ + public GLCopyHelper() { this(1); } - public GLCopyHelper(int bufferCount){ + /** + * Instantiates a new Gl copy helper. + * + * @param bufferCount the buffer count + */ + public GLCopyHelper(int bufferCount) { this.bufferCount = bufferCount; } private int[] mDstFrameBuffer; private int[] mSrcFrameBuffer; + /** + * Copy 2 d texture to oes texture. 
+ * + * @param srcTexture the src texture + * @param dstTexture the dst texture + * @param width the width + * @param height the height + * @param index the index + */ public void copy2DTextureToOesTexture( int srcTexture, int dstTexture, int width, int height, - int index){ - if(mDstFrameBuffer == null){ + int index) { + if (mDstFrameBuffer == null) { mDstFrameBuffer = new int[bufferCount]; GLES20.glGenFramebuffers(bufferCount, mDstFrameBuffer, 0); } - if(mSrcFrameBuffer == null){ + if (mSrcFrameBuffer == null) { mSrcFrameBuffer = new int[bufferCount]; GLES20.glGenFramebuffers(bufferCount, mSrcFrameBuffer, 0); } @@ -70,13 +90,16 @@ public void copy2DTextureToOesTexture( GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0); } - public void release(){ - if(mDstFrameBuffer != null){ + /** + * Release. + */ + public void release() { + if (mDstFrameBuffer != null) { GLES20.glDeleteFramebuffers(mDstFrameBuffer.length, mDstFrameBuffer, 0); mDstFrameBuffer = null; } - if(mSrcFrameBuffer != null){ + if (mSrcFrameBuffer != null) { GLES20.glDeleteFramebuffers(mSrcFrameBuffer.length, mSrcFrameBuffer, 0); mSrcFrameBuffer = null; } diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLFrameBuffer.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLFrameBuffer.java index c2ddc8216..4372c0700 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLFrameBuffer.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLFrameBuffer.java @@ -7,6 +7,9 @@ import io.agora.base.internal.video.GlRectDrawer; import io.agora.base.internal.video.RendererCommon; +/** + * The type Gl frame buffer. + */ public class GLFrameBuffer { private int mFramebufferId = -1; @@ -18,10 +21,20 @@ public class GLFrameBuffer { private float[] mTexMatrix = GLUtils.IDENTITY_MATRIX; + /** + * Instantiates a new Gl frame buffer. 
+ */ public GLFrameBuffer() { } + /** + * Sets size. + * + * @param width the width + * @param height the height + * @return the size + */ public boolean setSize(int width, int height) { if (mWidth != width || mHeight != height) { mWidth = width; @@ -32,36 +45,66 @@ public boolean setSize(int width, int height) { return false; } + /** + * Sets rotation. + * + * @param rotation the rotation + */ public void setRotation(int rotation) { if (mRotation != rotation) { mRotation = rotation; } } + /** + * Sets flip v. + * + * @param flipV the flip v + */ public void setFlipV(boolean flipV) { if (isFlipV != flipV) { isFlipV = flipV; } } + /** + * Sets flip h. + * + * @param flipH the flip h + */ public void setFlipH(boolean flipH) { if (isFlipH != flipH) { isFlipH = flipH; } } - public void setTextureId(int textureId){ - if(mTextureId != textureId){ + /** + * Sets texture id. + * + * @param textureId the texture id + */ + public void setTextureId(int textureId) { + if (mTextureId != textureId) { deleteTexture(); mTextureId = textureId; isTextureChanged = true; } } - public int getTextureId(){ + /** + * Gets texture id. + * + * @return the texture id + */ + public int getTextureId() { return mTextureId; } + /** + * Sets tex matrix. + * + * @param matrix the matrix + */ public void setTexMatrix(float[] matrix) { if (matrix != null) { mTexMatrix = matrix; @@ -70,32 +113,43 @@ public void setTexMatrix(float[] matrix) { } } - public void resetTransform(){ + /** + * Reset transform. + */ + public void resetTransform() { mTexMatrix = GLUtils.IDENTITY_MATRIX; - isFlipH = isFlipV = false; + isFlipH = false; + isFlipV = false; mRotation = 0; } + /** + * Process int. 
+ * + * @param textureId the texture id + * @param textureType the texture type + * @return the int + */ public int process(int textureId, int textureType) { if (mWidth <= 0 && mHeight <= 0) { throw new RuntimeException("setSize firstly!"); } - if(mTextureId == -1){ + if (mTextureId == -1) { mTextureId = createTexture(mWidth, mHeight); bindFramebuffer(mTextureId); isTextureInner = true; - }else if(isTextureInner && isSizeChanged){ + } else if (isTextureInner && isSizeChanged) { GLES20.glDeleteTextures(1, new int[]{mTextureId}, 0); mTextureId = createTexture(mWidth, mHeight); bindFramebuffer(mTextureId); - }else if(isTextureChanged){ + } else if (isTextureChanged) { bindFramebuffer(mTextureId); } isTextureChanged = false; isSizeChanged = false; - if(drawer == null){ + if (drawer == null) { drawer = new GlRectDrawer(); } @@ -106,15 +160,15 @@ public int process(int textureId, int textureType) { transform.preTranslate(0.5f, 0.5f); transform.preRotate(mRotation, 0.f, 0.f); transform.preScale( - isFlipH ? -1.f: 1.f, - isFlipV ? -1.f: 1.f + isFlipH ? -1.f : 1.f, + isFlipV ? -1.f : 1.f ); transform.preTranslate(-0.5f, -0.5f); float[] matrix = RendererCommon.convertMatrixFromAndroidGraphicsMatrix(transform); - if(textureType == GLES11Ext.GL_TEXTURE_EXTERNAL_OES){ + if (textureType == GLES11Ext.GL_TEXTURE_EXTERNAL_OES) { drawer.drawOes(textureId, matrix, mWidth, mHeight, 0, 0, mWidth, mHeight); - }else{ + } else { drawer.drawRgb(textureId, matrix, mWidth, mHeight, 0, 0, mWidth, mHeight); } GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0); @@ -123,11 +177,14 @@ public int process(int textureId, int textureType) { return mTextureId; } - public void release(){ + /** + * Release. + */ + public void release() { deleteTexture(); deleteFramebuffer(); - if(drawer != null){ + if (drawer != null) { drawer.release(); drawer = null; } @@ -141,7 +198,14 @@ private void deleteFramebuffer() { } } - public int createTexture(int width, int height){ + /** + * Create texture int. 
+ * + * @param width the width + * @param height the height + * @return the int + */ + public int createTexture(int width, int height) { int[] textures = new int[1]; GLES20.glGenTextures(1, textures, 0); GLUtils.checkGlError("glGenTextures"); @@ -165,6 +229,13 @@ public int createTexture(int width, int height){ return textureId; } + /** + * Resize texture. + * + * @param textureId the texture id + * @param width the width + * @param height the height + */ public void resizeTexture(int textureId, int width, int height) { GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId); GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0, @@ -181,7 +252,7 @@ private void deleteTexture() { } private void bindFramebuffer(int textureId) { - if(mFramebufferId == -1){ + if (mFramebufferId == -1) { int[] framebuffers = new int[1]; GLES20.glGenFramebuffers(1, framebuffers, 0); GLUtils.checkGlError("glGenFramebuffers"); diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLTextureBufferQueue.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLTextureBufferQueue.kt index 19bee8812..2b9ca20b1 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLTextureBufferQueue.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLTextureBufferQueue.kt @@ -29,6 +29,14 @@ import android.util.Log import android.util.Size import java.util.concurrent.ConcurrentLinkedQueue +/** + * G l texture buffer queue + * + * @property glFrameBuffer + * @property cacheCount + * @property loggable + * @constructor Create empty G l texture buffer queue + */ class GLTextureBufferQueue( private val glFrameBuffer: GLFrameBuffer = GLFrameBuffer(), private val cacheCount: Int = 6, @@ -41,6 +49,12 @@ class GLTextureBufferQueue( private val textureIdQueue = ConcurrentLinkedQueue() + /** + * Enqueue + * + * @param iN + * @return + */ fun enqueue(iN: TextureIn): Int { 
var size = textureIdQueue.size if (size < cacheCount) { @@ -126,6 +140,11 @@ class GLTextureBufferQueue( return size } + /** + * Dequeue + * + * @return + */ fun dequeue(): TextureOut? { val size = textureIdQueue.size val poll = textureIdQueue.poll() @@ -136,11 +155,19 @@ class GLTextureBufferQueue( return poll } + /** + * Reset + * + */ fun reset() { cacheIndex = 0 textureIdQueue.clear() } + /** + * Release + * + */ fun release() { cacheIndex = 0 cacheTextureOuts.forEachIndexed { index, textureOut -> @@ -153,6 +180,21 @@ class GLTextureBufferQueue( glFrameBuffer.release() } + /** + * Texture in + * + * @property textureId + * @property textureType + * @property width + * @property height + * @property rotation + * @property flipV + * @property isFrontCamera + * @property isMirror + * @property transform + * @property tag + * @constructor Create empty Texture in + */ data class TextureIn( val textureId: Int, val textureType: Int, @@ -166,6 +208,18 @@ class GLTextureBufferQueue( val tag: Any? = null ) + /** + * Texture out + * + * @property index + * @property textureId + * @property textureType + * @property width + * @property height + * @property isFrontCamera + * @property tag + * @constructor Create empty Texture out + */ data class TextureOut( var index: Int = 0, val textureId: Int, diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLUtils.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLUtils.java index 071587426..887da3cc6 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLUtils.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/GLUtils.java @@ -44,8 +44,14 @@ import io.agora.beautyapi.faceunity.utils.LogUtils; -public class GLUtils { +/** + * The type Gl utils. + */ +public final class GLUtils { private static final String TAG = "GLUtils"; + /** + * The constant IDENTITY_MATRIX. 
+ */ public static final float[] IDENTITY_MATRIX = new float[16]; static { @@ -55,6 +61,14 @@ public class GLUtils { private GLUtils() { } + /** + * Gets texture 2 d image. + * + * @param textureID the texture id + * @param width the width + * @param height the height + * @return the texture 2 d image + */ public static Bitmap getTexture2DImage(int textureID, int width, int height) { try { int[] oldFboId = new int[1]; @@ -96,6 +110,14 @@ public static Bitmap getTexture2DImage(int textureID, int width, int height) { return null; } + /** + * Gets texture oes image. + * + * @param textureID the texture id + * @param width the width + * @param height the height + * @return the texture oes image + */ public static Bitmap getTextureOESImage(int textureID, int width, int height) { try { int[] oldFboId = new int[1]; @@ -137,6 +159,14 @@ public static Bitmap getTextureOESImage(int textureID, int width, int height) { return null; } + /** + * Nv 21 to bitmap bitmap. + * + * @param nv21 the nv 21 + * @param width the width + * @param height the height + * @return the bitmap + */ public static Bitmap nv21ToBitmap(byte[] nv21, int width, int height) { Bitmap bitmap = null; try { @@ -161,6 +191,14 @@ private static Bitmap readBitmap(int width, int height) { return bitmap; } + /** + * Create transform matrix float [ ]. + * + * @param rotation the rotation + * @param flipH the flip h + * @param flipV the flip v + * @return the float [ ] + */ public static float[] createTransformMatrix(int rotation, boolean flipH, boolean flipV) { float[] renderMVPMatrix = new float[16]; float[] tmp = new float[16]; @@ -193,6 +231,11 @@ public static float[] createTransformMatrix(int rotation, boolean flipH, boolean return renderMVPMatrix; } + /** + * Gets curr gl context. 
+ * + * @return the curr gl context + */ public static EGLContext getCurrGLContext() { EGL10 egl = (EGL10) EGLContext.getEGL(); if (egl != null && !Objects.equals(egl.eglGetCurrentContext(), EGL10.EGL_NO_CONTEXT)) { @@ -201,6 +244,11 @@ public static EGLContext getCurrGLContext() { return null; } + /** + * Check gl error. + * + * @param op the op + */ public static void checkGlError(String op) { int error = GLES20.glGetError(); if (error != GLES20.GL_NO_ERROR) { @@ -210,6 +258,13 @@ public static void checkGlError(String op) { } } + /** + * Create program int. + * + * @param vertexSource the vertex source + * @param fragmentSource the fragment source + * @return the int + */ public static int createProgram(String vertexSource, String fragmentSource) { int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexSource); if (vertexShader == 0) { @@ -240,6 +295,13 @@ public static int createProgram(String vertexSource, String fragmentSource) { return program; } + /** + * Load shader int. + * + * @param shaderType the shader type + * @param source the source + * @return the int + */ public static int loadShader(int shaderType, String source) { int shader = GLES20.glCreateShader(shaderType); checkGlError("glCreateShader type=" + shaderType); @@ -256,6 +318,17 @@ public static int loadShader(int shaderType, String source) { return shader; } + /** + * Create texture int. 
+ * + * @param textureTarget the texture target + * @param bitmap the bitmap + * @param minFilter the min filter + * @param magFilter the mag filter + * @param wrapS the wrap s + * @param wrapT the wrap t + * @return the int + */ public static int createTexture(int textureTarget, Bitmap bitmap, int minFilter, int magFilter, int wrapS, int wrapT) { int[] textureHandle = new int[1]; diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/TextureProcessHelper.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/TextureProcessHelper.kt index 8417aada4..c36491c60 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/TextureProcessHelper.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/faceunity/utils/egl/TextureProcessHelper.kt @@ -33,6 +33,12 @@ import java.util.concurrent.Executors import java.util.concurrent.Future import javax.microedition.khronos.egl.EGLContext +/** + * Texture process helper + * + * @property cacheCount + * @constructor Create empty Texture process helper + */ class TextureProcessHelper( private val cacheCount: Int = 2 ) { @@ -49,10 +55,29 @@ class TextureProcessHelper( private var isBegin = false private var frameIndex = 0 + /** + * Set filter + * + * @param filter + * @receiver + */ fun setFilter(filter: (GLTextureBufferQueue.TextureOut) -> Int) { this.filter = filter } + /** + * Process + * + * @param texId + * @param texType + * @param width + * @param height + * @param rotation + * @param transform + * @param isFrontCamera + * @param isMirror + * @return + */ fun process( texId: Int, texType: Int, width: Int, height: Int, rotation: Int, @@ -159,6 +184,10 @@ class TextureProcessHelper( return ret } + /** + * Reset + * + */ fun reset(){ if(frameIndex == 0){ return @@ -176,8 +205,16 @@ class TextureProcessHelper( } } + /** + * Size + * + */ fun size() = futureQueue.size + /** + * Release + * + */ fun release() { isReleased = true 
filter = null @@ -199,6 +236,12 @@ class TextureProcessHelper( workerThread.shutdown() } + /** + * Execute sync + * + * @param run + * @receiver + */ fun executeSync(run: () -> Unit) { val latch = CountDownLatch(1) workerThread.execute { diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/SenseTimeBeautyAPI.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/SenseTimeBeautyAPI.kt index 47f31bd7f..517d2f1ea 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/SenseTimeBeautyAPI.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/SenseTimeBeautyAPI.kt @@ -32,13 +32,37 @@ import io.agora.base.VideoFrame import io.agora.rtc2.Constants import io.agora.rtc2.RtcEngine +/** + * Version + */ const val VERSION = "1.0.3" +/** + * Capture mode + * + * @constructor Create empty Capture mode + */ enum class CaptureMode{ + /** + * Agora + * + * @constructor Create empty Agora + */ Agora, // Process frames through Agora's internal raw-data interface + + /** + * Custom + * + * @constructor Create empty Custom + */ Custom // Custom mode: call onFrame yourself to pass raw video frames to the BeautyAPI for processing } +/** + * I event callback + * + * @constructor Create empty I event callback + */ interface IEventCallback{ /** @@ -49,27 +73,83 @@ interface IEventCallback{ fun onBeautyStats(stats: BeautyStats) } +/** + * Beauty stats + * + * @property minCostMs + * @property maxCostMs + * @property averageCostMs + * @constructor Create empty Beauty stats + */ data class BeautyStats( val minCostMs:Long, // minimum value within the stats interval val maxCostMs: Long, // maximum value within the stats interval val averageCostMs: Long // average value within the stats interval ) +/** + * Mirror mode + * + * @constructor Create empty Mirror mode + */ enum class MirrorMode { // Definition of a normal, non-mirrored picture: with the front camera, the captured picture and what the phone shows are left-right reversed; with the back camera they are consistent + /** + * Mirror Local Remote + * + * @constructor Create empty Mirror Local Remote + */ MIRROR_LOCAL_REMOTE, //Mirror both local and remote; front-camera default; stickers look normal both locally and remotely + + /** + * Mirror Local Only + + * 
@constructor Create empty Mirror Local Only + */ MIRROR_LOCAL_ONLY, // Mirror local only; the remote side is not mirrored; remote stickers look normal while local stickers are mirrored. Used for call scenarios and e-commerce live streaming (keeps the signboard text behind the host readable on the remote side); because local and remote are opposite in this mode, text stickers are bound to appear reversed on one side + + /** + * Mirror Remote Only + * + * @constructor Create empty Mirror Remote Only + */ MIRROR_REMOTE_ONLY, // Mirror remote only; the local side is not mirrored; remote stickers look normal while local stickers are mirrored + + /** + * Mirror None + * + * @constructor Create empty Mirror None + */ MIRROR_NONE // No mirroring for local or remote; back-camera default; stickers look normal both locally and remotely } +/** + * Camera config + * + * @property frontMirror + * @property backMirror + * @constructor Create empty Camera config + */ data class CameraConfig( val frontMirror: MirrorMode = MirrorMode.MIRROR_LOCAL_REMOTE, // Front-camera default mirroring: mirror both local and remote val backMirror: MirrorMode = MirrorMode.MIRROR_NONE // Back-camera default mirroring: no mirroring for local or remote ) +/** + * Config + * + * @property context + * @property rtcEngine + * @property stHandlers + * @property eventCallback + * @property captureMode + * @property statsDuration + * @property statsEnable + * @property cameraConfig + * @constructor Create empty Config + */ data class Config( val context: Context, // Android context val rtcEngine: RtcEngine, // Agora RTC engine @@ -81,29 +161,115 @@ data class Config( val cameraConfig: CameraConfig = CameraConfig() // Camera mirror configuration ) +/** + * S t handlers + * + * @property effectNative + * @property humanActionNative + * @constructor Create empty S t handlers + */ data class STHandlers( val effectNative: STMobileEffectNative, val humanActionNative: STMobileHumanActionNative ) +/** + * Error code + * + * @property value + * @constructor Create empty Error code + */ enum class ErrorCode(val value: Int) { + /** + * Error Ok + * + * @constructor Create empty Error Ok + */ ERROR_OK(0), // Everything is OK + + /** + * Error Has Not Initialized + * + * @constructor Create empty Error Has Not Initialized + */ ERROR_HAS_NOT_INITIALIZED(101), // Another API was called before initialize() was called, or after it failed + + /** + * Error Has Initialized + * + * @constructor Create empty Error 
Has Initialized + */ ERROR_HAS_INITIALIZED(102), // initialize() was called again after it had already succeeded + + /** + * Error Has Released + * + * @constructor Create empty Error Has Released + */ ERROR_HAS_RELEASED(103), // Another API was called after release() had destroyed the instance + + /** + * Error Process Not Custom + * + * @constructor Create empty Error Process Not Custom + */ ERROR_PROCESS_NOT_CUSTOM(104), // onFrame was called to pass in external video frames while not in Custom processing mode + + /** + * Error Process Disable + * + * @constructor Create empty Error Process Disable + */ ERROR_PROCESS_DISABLE(105), // Returned by onFrame after beauty was disabled with enable(false) + + /** + * Error View Type Error + * + * @constructor Create empty Error View Type Error + */ ERROR_VIEW_TYPE_ERROR(106), // Returned by setupLocalVideo when the view type is wrong + + /** + * Error Frame Skipped + * + * @constructor Create empty Error Frame Skipped + */ ERROR_FRAME_SKIPPED(107), // Returned by onFrame when the frame is skipped } +/** + * Beauty preset + * + * @constructor Create empty Beauty preset + */ enum class BeautyPreset { + /** + * Custom + * + * @constructor Create empty Custom + */ CUSTOM, // Do not use the recommended beauty parameters + + /** + * Default + * + * @constructor Create empty Default + */ DEFAULT // The default } +/** + * Create sense time beauty a p i + * + * @return + */ fun createSenseTimeBeautyAPI(): SenseTimeBeautyAPI = SenseTimeBeautyAPIImpl() +/** + * Sense time beauty a p i + * + * @constructor Create empty Sense time beauty a p i + */ interface SenseTimeBeautyAPI { /** @@ -157,6 +323,11 @@ interface SenseTimeBeautyAPI { */ fun isFrontCamera(): Boolean + /** + * Get mirror applied + * + * @return + */ fun getMirrorApplied(): Boolean /** diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/SenseTimeBeautyAPIImpl.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/SenseTimeBeautyAPIImpl.kt index d27856303..fe056a10f 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/SenseTimeBeautyAPIImpl.kt +++ 
b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/SenseTimeBeautyAPIImpl.kt @@ -55,6 +55,11 @@ import java.nio.ByteBuffer import java.util.concurrent.Callable import java.util.concurrent.Executors +/** + * Sense time beauty a p i impl + * + * @constructor Create empty Sense time beauty a p i impl + */ class SenseTimeBeautyAPIImpl : SenseTimeBeautyAPI, IVideoFrameObserver { private val TAG = "SenseTimeBeautyAPIImpl" private val reportId = "scenarioAPI" @@ -78,15 +83,56 @@ class SenseTimeBeautyAPIImpl : SenseTimeBeautyAPI, IVideoFrameObserver { private var localVideoRenderMode = Constants.RENDER_MODE_HIDDEN private enum class ProcessSourceType{ + /** + * Unknown + * + * @constructor Create empty Unknown + */ UNKNOWN, + + /** + * Texture Oes Api26 + * + * @constructor Create empty Texture Oes Api26 + */ TEXTURE_OES_API26, + + /** + * Texture 2d Api26 + * + * @constructor Create empty Texture 2d Api26 + */ TEXTURE_2D_API26, + + /** + * Texture Oes + * + * @constructor Create empty Texture Oes + */ TEXTURE_OES, + + /** + * Texture 2d + * + * @constructor Create empty Texture 2d + */ TEXTURE_2D, + + /** + * I420 + * + * @constructor Create empty I420 + */ I420, } private var currProcessSourceType = ProcessSourceType.UNKNOWN + /** + * Initialize + * + * @param config + * @return + */ override fun initialize(config: Config): Int { if (this.config != null) { LogUtils.e(TAG, "initialize >> The beauty api has been initialized!") @@ -108,6 +154,12 @@ class SenseTimeBeautyAPIImpl : SenseTimeBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_OK.value } + /** + * Enable + * + * @param enable + * @return + */ override fun enable(enable: Boolean): Int { LogUtils.i(TAG, "enable >> enable = $enable") if (config == null) { @@ -133,6 +185,13 @@ class SenseTimeBeautyAPIImpl : SenseTimeBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_OK.value } + /** + * Setup local video + * + * @param view + * @param renderMode + * @return + */ override fun 
setupLocalVideo(view: View, renderMode: Int): Int { val rtcEngine = config?.rtcEngine if(rtcEngine == null){ @@ -151,6 +210,12 @@ class SenseTimeBeautyAPIImpl : SenseTimeBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_VIEW_TYPE_ERROR.value } + /** + * On frame + * + * @param videoFrame + * @return + */ override fun onFrame(videoFrame: VideoFrame): Int { val conf = config if(conf == null){ @@ -175,6 +240,12 @@ class SenseTimeBeautyAPIImpl : SenseTimeBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_FRAME_SKIPPED.value } + /** + * Set beauty preset + * + * @param preset + * @return + */ override fun setBeautyPreset(preset: BeautyPreset): Int { val effectNative = config?.stHandlers?.effectNative if(effectNative == null){ @@ -297,6 +368,12 @@ class SenseTimeBeautyAPIImpl : SenseTimeBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_OK.value } + /** + * Update camera config + * + * @param config + * @return + */ override fun updateCameraConfig(config: CameraConfig): Int { LogUtils.i(TAG, "updateCameraConfig >> oldCameraConfig=$cameraConfig, newCameraConfig=$config") cameraConfig = CameraConfig(config.frontMirror, config.backMirror) @@ -305,14 +382,29 @@ class SenseTimeBeautyAPIImpl : SenseTimeBeautyAPI, IVideoFrameObserver { return ErrorCode.ERROR_OK.value } + /** + * Is front camera + * + */ override fun isFrontCamera() = isFrontCamera + /** + * Set parameters + * + * @param key + * @param value + */ override fun setParameters(key: String, value: String) { when(key){ "beauty_mode" -> beautyMode = value.toInt() } } + /** + * Release + * + * @return + */ override fun release(): Int { if(config == null){ LogUtils.e(TAG, "release >> The beauty api has not been initialized!") @@ -630,32 +722,79 @@ class SenseTimeBeautyAPIImpl : SenseTimeBeautyAPI, IVideoFrameObserver { // IVideoFrameObserver implements + /** + * On capture video frame + * + * @param sourceType + * @param videoFrame + * @return + */ override fun onCaptureVideoFrame(sourceType: Int, 
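The guards in initialize/enable/release above amount to a small state machine over the error codes; a standalone sketch under that reading (class and method names are illustrative only):

```kotlin
// Illustrative state machine behind the initialize/enable/release guards;
// the real implementation also wires up RtcEngine and the SenseTime handles.
class BeautyLifecycle {
    private var initialized = false

    fun initialize(): Int {
        if (initialized) return 102   // ERROR_HAS_INITIALIZED
        initialized = true
        return 0                      // ERROR_OK
    }

    fun enable(enable: Boolean): Int =
        if (!initialized) 101 else 0  // ERROR_HAS_NOT_INITIALIZED before init

    fun release(): Int {
        if (!initialized) return 101
        initialized = false
        return 0
    }
}
```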
videoFrame: VideoFrame?): Boolean { videoFrame ?: return false return processBeauty(videoFrame) } + /** + * On pre encode video frame + * + * @param sourceType + * @param videoFrame + * @return + */ override fun onPreEncodeVideoFrame(sourceType: Int, videoFrame: VideoFrame?) : Boolean { return true } + /** + * On media player video frame + * + * @param videoFrame + * @param mediaPlayerId + */ override fun onMediaPlayerVideoFrame(videoFrame: VideoFrame?, mediaPlayerId: Int) = false + /** + * On render video frame + * + * @param channelId + * @param uid + * @param videoFrame + */ override fun onRenderVideoFrame( channelId: String?, uid: Int, videoFrame: VideoFrame? ) = false + /** + * Get video frame process mode + * + */ override fun getVideoFrameProcessMode() = IVideoFrameObserver.PROCESS_MODE_READ_WRITE + /** + * Get video format preference + * + */ override fun getVideoFormatPreference() = IVideoFrameObserver.VIDEO_PIXEL_DEFAULT + /** + * Get rotation applied + * + */ override fun getRotationApplied() = false + /** + * Get mirror applied + * + */ override fun getMirrorApplied() = captureMirror && !enable + /** + * Get observed frame position + * + */ override fun getObservedFramePosition() = IVideoFrameObserver.POSITION_POST_CAPTURER } \ No newline at end of file diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/LogUtils.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/LogUtils.kt index b02cc6ecf..c73922732 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/LogUtils.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/LogUtils.kt @@ -32,6 +32,11 @@ import java.util.Date import java.util.Locale import java.util.concurrent.Executors +/** + * Log utils + * + * @constructor Create empty Log utils + */ object LogUtils { private const val beautyType = "SenseTime" private val timeFormat = SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS", Locale.ROOT) @@ 
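Note the interplay in `getMirrorApplied() = captureMirror && !enable`: the SDK-level capture mirror is applied only while beauty processing is off; once enabled, mirroring is presumably handled inside the processing pipeline instead. As a pure function:

```kotlin
// Sketch of the observer's mirror decision: the SDK applies the capture
// mirror only while beauty processing is disabled.
fun mirrorAppliedBySdk(captureMirror: Boolean, beautyEnabled: Boolean): Boolean =
    captureMirror && !beautyEnabled
```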
-39,6 +44,11 @@ object LogUtils { private val workerThread = Executors.newSingleThreadExecutor() private var logOutputStream: FileOutputStream? = null + /** + * Set log file path + * + * @param path + */ @JvmStatic fun setLogFilePath(path: String){ if(path.isEmpty()){ @@ -58,6 +68,13 @@ object LogUtils { } + /** + * I + * + * @param tag + * @param content + * @param args + */ @JvmStatic fun i(tag: String, content: String, vararg args: Any) { val consoleMessage = "[BeautyAPI][$beautyType] : ${String.format(content, args)}" @@ -66,6 +83,13 @@ object LogUtils { saveToFile(fileMessage) } + /** + * D + * + * @param tag + * @param content + * @param args + */ @JvmStatic fun d(tag: String, content: String, vararg args: Any) { val consoleMessage = "[BeautyAPI][$beautyType] : ${String.format(content, args)}" @@ -74,6 +98,13 @@ object LogUtils { saveToFile(fileMessage) } + /** + * W + * + * @param tag + * @param content + * @param args + */ @JvmStatic fun w(tag: String, content: String, vararg args: Any){ val consoleMessage = "[BeautyAPI][$beautyType] : ${String.format(content, args)}" @@ -82,6 +113,13 @@ object LogUtils { saveToFile(fileMessage) } + /** + * E + * + * @param tag + * @param content + * @param args + */ @JvmStatic fun e(tag: String, content: String, vararg args: Any){ val consoleMessage = "[BeautyAPI][$beautyType] : ${String.format(content, args)}" diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/StatsHelper.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/StatsHelper.kt index 7391003ae..748a8919d 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/StatsHelper.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/StatsHelper.kt @@ -30,6 +30,13 @@ import io.agora.beautyapi.sensetime.BeautyStats import kotlin.math.max import kotlin.math.min +/** + * Stats helper + * + * @property statsDuration + * @property onStatsChanged + * @constructor 
Create empty Stats helper + */ class StatsHelper( private val statsDuration: Long, private val onStatsChanged: (BeautyStats) -> Unit @@ -41,6 +48,11 @@ class StatsHelper( private var mCostMax = 0L private var mCostMin = Long.MAX_VALUE + /** + * Once + * + * @param cost + */ fun once(cost: Long) { val curr = System.currentTimeMillis() if (mStartTime == 0L) { @@ -68,6 +80,10 @@ class StatsHelper( mCostMin = min(mCostMin, cost) } + /** + * Reset + * + */ fun reset() { mMainHandler.removeCallbacksAndMessages(null) mStartTime = 0 diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLCopyHelper.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLCopyHelper.java index f939bd62e..2385613ed 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLCopyHelper.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLCopyHelper.java @@ -4,31 +4,51 @@ import android.opengl.GLES20; import android.opengl.GLES30; +/** + * The type Gl copy helper. + */ public class GLCopyHelper { private final int bufferCount; - public GLCopyHelper(){ + /** + * Instantiates a new Gl copy helper. + */ + public GLCopyHelper() { this(1); } - public GLCopyHelper(int bufferCount){ + /** + * Instantiates a new Gl copy helper. + * + * @param bufferCount the buffer count + */ + public GLCopyHelper(int bufferCount) { this.bufferCount = bufferCount; } private int[] mDstFrameBuffer; private int[] mSrcFrameBuffer; + /** + * Copy 2 d texture to oes texture. 
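`StatsHelper.once` above accumulates per-frame cost and emits a summary after `statsDuration` elapses; a self-contained sketch of that accumulation (the clock is passed in explicitly so it can be tested, unlike the real class which reads `System.currentTimeMillis()`):

```kotlin
// Standalone sketch of StatsHelper's accumulation: track min/average/max
// frame cost and emit a summary once statsDuration has elapsed.
class CostStats(private val statsDuration: Long) {
    private var startTime = 0L
    private var count = 0
    private var costTotal = 0L
    private var costMax = 0L
    private var costMin = Long.MAX_VALUE

    // Returns Triple(min, average, max) when a window completes, else null.
    fun once(now: Long, cost: Long): Triple<Long, Long, Long>? {
        if (startTime == 0L) startTime = now
        count++
        costTotal += cost
        costMax = maxOf(costMax, cost)
        costMin = minOf(costMin, cost)
        if (now - startTime < statsDuration) return null
        val out = Triple(costMin, costTotal / count, costMax)
        // reset the window
        startTime = now
        count = 0
        costTotal = 0
        costMax = 0
        costMin = Long.MAX_VALUE
        return out
    }
}
```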
+ * + * @param srcTexture the src texture + * @param dstTexture the dst texture + * @param width the width + * @param height the height + * @param index the index + */ public void copy2DTextureToOesTexture( int srcTexture, int dstTexture, int width, int height, - int index){ - if(mDstFrameBuffer == null){ + int index) { + if (mDstFrameBuffer == null) { mDstFrameBuffer = new int[bufferCount]; GLES20.glGenFramebuffers(bufferCount, mDstFrameBuffer, 0); } - if(mSrcFrameBuffer == null){ + if (mSrcFrameBuffer == null) { mSrcFrameBuffer = new int[bufferCount]; GLES20.glGenFramebuffers(bufferCount, mSrcFrameBuffer, 0); } @@ -46,13 +66,16 @@ public void copy2DTextureToOesTexture( GLES30.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, 0); } - public void release(){ - if(mDstFrameBuffer != null){ + /** + * Release. + */ + public void release() { + if (mDstFrameBuffer != null) { GLES20.glDeleteFramebuffers(mDstFrameBuffer.length, mDstFrameBuffer, 0); mDstFrameBuffer = null; } - if(mSrcFrameBuffer != null){ + if (mSrcFrameBuffer != null) { GLES20.glDeleteFramebuffers(mSrcFrameBuffer.length, mSrcFrameBuffer, 0); mSrcFrameBuffer = null; } diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLFrameBuffer.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLFrameBuffer.java index 0eb431533..b9ae7a1e5 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLFrameBuffer.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLFrameBuffer.java @@ -7,6 +7,9 @@ import io.agora.base.internal.video.GlRectDrawer; import io.agora.base.internal.video.RendererCommon; +/** + * The type Gl frame buffer. + */ public class GLFrameBuffer { private int mFramebufferId = -1; @@ -18,10 +21,20 @@ public class GLFrameBuffer { private float[] mTexMatrix = GlUtil.IDENTITY_MATRIX; + /** + * Instantiates a new Gl frame buffer. 
+ */ public GLFrameBuffer() { } + /** + * Sets size. + * + * @param width the width + * @param height the height + * @return the size + */ public boolean setSize(int width, int height) { if (mWidth != width || mHeight != height) { mWidth = width; @@ -32,36 +45,66 @@ public boolean setSize(int width, int height) { return false; } + /** + * Sets rotation. + * + * @param rotation the rotation + */ public void setRotation(int rotation) { if (mRotation != rotation) { mRotation = rotation; } } + /** + * Sets flip v. + * + * @param flipV the flip v + */ public void setFlipV(boolean flipV) { if (isFlipV != flipV) { isFlipV = flipV; } } + /** + * Sets flip h. + * + * @param flipH the flip h + */ public void setFlipH(boolean flipH) { if (isFlipH != flipH) { isFlipH = flipH; } } - public void setTextureId(int textureId){ - if(mTextureId != textureId){ + /** + * Sets texture id. + * + * @param textureId the texture id + */ + public void setTextureId(int textureId) { + if (mTextureId != textureId) { deleteTexture(); mTextureId = textureId; isTextureChanged = true; } } - public int getTextureId(){ + /** + * Gets texture id. + * + * @return the texture id + */ + public int getTextureId() { return mTextureId; } + /** + * Sets tex matrix. + * + * @param matrix the matrix + */ public void setTexMatrix(float[] matrix) { if (matrix != null) { mTexMatrix = matrix; @@ -70,32 +113,43 @@ public void setTexMatrix(float[] matrix) { } } - public void resetTransform(){ + /** + * Reset transform. + */ + public void resetTransform() { mTexMatrix = GlUtil.IDENTITY_MATRIX; - isFlipH = isFlipV = false; + isFlipH = false; + isFlipV = false; mRotation = 0; } + /** + * Process int. 
+ * + * @param textureId the texture id + * @param textureType the texture type + * @return the int + */ public int process(int textureId, int textureType) { if (mWidth <= 0 && mHeight <= 0) { throw new RuntimeException("setSize firstly!"); } - if(mTextureId == -1){ + if (mTextureId == -1) { mTextureId = createTexture(mWidth, mHeight); bindFramebuffer(mTextureId); isTextureInner = true; - }else if(isTextureInner && isSizeChanged){ + } else if (isTextureInner && isSizeChanged) { GLES20.glDeleteTextures(1, new int[]{mTextureId}, 0); mTextureId = createTexture(mWidth, mHeight); bindFramebuffer(mTextureId); - }else if(isTextureChanged){ + } else if (isTextureChanged) { bindFramebuffer(mTextureId); } isTextureChanged = false; isSizeChanged = false; - if(drawer == null){ + if (drawer == null) { drawer = new GlRectDrawer(); } @@ -106,15 +160,15 @@ public int process(int textureId, int textureType) { transform.preTranslate(0.5f, 0.5f); transform.preRotate(mRotation, 0.f, 0.f); transform.preScale( - isFlipH ? -1.f: 1.f, - isFlipV ? -1.f: 1.f + isFlipH ? -1.f : 1.f, + isFlipV ? -1.f : 1.f ); transform.preTranslate(-0.5f, -0.5f); float[] matrix = RendererCommon.convertMatrixFromAndroidGraphicsMatrix(transform); - if(textureType == GLES11Ext.GL_TEXTURE_EXTERNAL_OES){ + if (textureType == GLES11Ext.GL_TEXTURE_EXTERNAL_OES) { drawer.drawOes(textureId, matrix, mWidth, mHeight, 0, 0, mWidth, mHeight); - }else{ + } else { drawer.drawRgb(textureId, matrix, mWidth, mHeight, 0, 0, mWidth, mHeight); } GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0); @@ -123,11 +177,14 @@ public int process(int textureId, int textureType) { return mTextureId; } - public void release(){ + /** + * Release. + */ + public void release() { deleteTexture(); deleteFramebuffer(); - if(drawer != null){ + if (drawer != null) { drawer.release(); drawer = null; } @@ -141,7 +198,14 @@ private void deleteFramebuffer() { } } - public int createTexture(int width, int height){ + /** + * Create texture int. 
+ * + * @param width the width + * @param height the height + * @return the int + */ + public int createTexture(int width, int height) { int[] textures = new int[1]; GLES20.glGenTextures(1, textures, 0); GlUtil.checkGlError("glGenTextures"); @@ -165,6 +229,13 @@ public int createTexture(int width, int height){ return textureId; } + /** + * Resize texture. + * + * @param textureId the texture id + * @param width the width + * @param height the height + */ public void resizeTexture(int textureId, int width, int height) { GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId); GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0, @@ -181,7 +252,7 @@ private void deleteTexture() { } private void bindFramebuffer(int textureId) { - if(mFramebufferId == -1){ + if (mFramebufferId == -1) { int[] framebuffers = new int[1]; GLES20.glGenFramebuffers(1, framebuffers, 0); GlUtil.checkGlError("glGenFramebuffers"); diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLTestUtils.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLTestUtils.java index 67f65cad1..bfacdf7cd 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLTestUtils.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLTestUtils.java @@ -39,9 +39,24 @@ import io.agora.beautyapi.sensetime.utils.LogUtils; -public class GLTestUtils { +/** + * The type Gl test utils. + */ +public final class GLTestUtils { private static final String TAG = "GLUtils"; + private GLTestUtils() { + + } + + /** + * Gets texture 2 d image. 
+ * + * @param textureID the texture id + * @param width the width + * @param height the height + * @return the texture 2 d image + */ public static Bitmap getTexture2DImage(int textureID, int width, int height) { try { int[] oldFboId = new int[1]; @@ -83,6 +98,14 @@ public static Bitmap getTexture2DImage(int textureID, int width, int height) { return null; } + /** + * Gets texture oes image. + * + * @param textureID the texture id + * @param width the width + * @param height the height + * @return the texture oes image + */ public static Bitmap getTextureOESImage(int textureID, int width, int height) { try { int[] oldFboId = new int[1]; @@ -124,6 +147,14 @@ public static Bitmap getTextureOESImage(int textureID, int width, int height) { return null; } + /** + * Nv 21 to bitmap bitmap. + * + * @param nv21 the nv 21 + * @param width the width + * @param height the height + * @return the bitmap + */ public static Bitmap nv21ToBitmap(byte[] nv21, int width, int height) { Bitmap bitmap = null; try { @@ -138,7 +169,7 @@ public static Bitmap nv21ToBitmap(byte[] nv21, int width, int height) { return bitmap; } - private static Bitmap readBitmap(int width, int height){ + private static Bitmap readBitmap(int width, int height) { ByteBuffer rgbaBuf = ByteBuffer.allocateDirect(width * height * 4); rgbaBuf.position(0); GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, rgbaBuf); diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLTextureBufferQueue.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLTextureBufferQueue.kt index fee26684b..bf973bbb4 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLTextureBufferQueue.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GLTextureBufferQueue.kt @@ -29,6 +29,13 @@ import android.util.Size import io.agora.beautyapi.sensetime.utils.LogUtils import 
java.util.concurrent.ConcurrentLinkedQueue +/** + * G l texture buffer queue + * + * @property glFrameBuffer + * @property cacheCount + * @constructor Create empty G l texture buffer queue + */ class GLTextureBufferQueue( private val glFrameBuffer: GLFrameBuffer, private val cacheCount: Int = 6 @@ -40,6 +47,12 @@ class GLTextureBufferQueue( private val textureIdQueue = ConcurrentLinkedQueue() + /** + * Enqueue + * + * @param iN + * @return + */ fun enqueue(iN: TextureIn): Int { var size = textureIdQueue.size if (size < cacheCount) { @@ -116,19 +129,36 @@ class GLTextureBufferQueue( return size } + /** + * Dequeue + * + * @return + */ fun dequeue(): TextureOut? { val size = textureIdQueue.size val poll = textureIdQueue.poll() return poll } + /** + * Size + * + */ fun size() = textureIdQueue.size + /** + * Reset + * + */ fun reset() { cacheIndex = 0 textureIdQueue.clear() } + /** + * Release + * + */ fun release() { cacheIndex = 0 cacheTextureOuts.forEachIndexed { index, textureOut -> @@ -140,6 +170,19 @@ class GLTextureBufferQueue( textureIdQueue.clear() } + /** + * Texture in + * + * @property textureId + * @property textureType + * @property width + * @property height + * @property rotation + * @property isFrontCamera + * @property isMirror + * @property transform + * @constructor Create empty Texture in + */ data class TextureIn( val textureId: Int, val textureType: Int, @@ -151,6 +194,17 @@ class GLTextureBufferQueue( val transform: FloatArray? 
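`GLTextureBufferQueue.enqueue` caps the number of buffered frames at `cacheCount` (default 6) and reports the pre-enqueue size; stripped of the GL texture handling, the queueing contract can be sketched as:

```kotlin
import java.util.concurrent.ConcurrentLinkedQueue

// Sketch of the queue's bounded-enqueue contract: at most cacheCount items
// are buffered; enqueue reports the size it saw and drops work when full.
class BoundedQueue<T>(private val cacheCount: Int = 6) {
    private val queue = ConcurrentLinkedQueue<T>()

    // Returns the size observed before enqueueing; the item is dropped when full.
    fun enqueue(item: T): Int {
        val size = queue.size
        if (size < cacheCount) queue.offer(item)
        return size
    }

    fun dequeue(): T? = queue.poll()
    fun size() = queue.size
    fun reset() = queue.clear()
}
```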
) + /** + * Texture out + * + * @property index + * @property textureId + * @property textureType + * @property width + * @property height + * @property isFrontCamera + * @constructor Create empty Texture out + */ data class TextureOut( var index: Int = 0, val textureId: Int, diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GlUtil.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GlUtil.java index 41c1d24e3..ceab345ea 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GlUtil.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/egl/GlUtil.java @@ -47,15 +47,24 @@ import io.agora.beautyapi.sensetime.utils.LogUtils; -public class GlUtil { +/** + * The type Gl util. + */ +public final class GlUtil { private static final String TAG = "GlUtil"; - /** Identity matrix for general use. Don't modify or life will get weird. */ + /** + * Identity matrix for general use. Don't modify or life will get weird. + */ public static final int NO_TEXTURE = -1; private static final int SIZEOF_FLOAT = 4; + /** + * The constant IDENTITY_MATRIX. + */ public static final float[] IDENTITY_MATRIX = new float[16]; + static { Matrix.setIdentityM(IDENTITY_MATRIX, 0); } @@ -63,6 +72,14 @@ public class GlUtil { private GlUtil() { // do not instantiate } + /** + * Create program int. + * + * @param applicationContext the application context + * @param vertexSourceRawId the vertex source raw id + * @param fragmentSourceRawId the fragment source raw id + * @return the int + */ public static int createProgram(Context applicationContext, @RawRes int vertexSourceRawId, @RawRes int fragmentSourceRawId) { @@ -72,6 +89,13 @@ public static int createProgram(Context applicationContext, @RawRes int vertexSo return createProgram(vertexSource, fragmentSource); } + /** + * Create program int. 
+ * + * @param vertexSource the vertex source + * @param fragmentSource the fragment source + * @return the int + */ public static int createProgram(String vertexSource, String fragmentSource) { int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexSource); if (vertexShader == 0) { @@ -102,6 +126,13 @@ public static int createProgram(String vertexSource, String fragmentSource) { return program; } + /** + * Load shader int. + * + * @param shaderType the shader type + * @param source the source + * @return the int + */ public static int loadShader(int shaderType, String source) { int shader = GLES20.glCreateShader(shaderType); checkGlError("glCreateShader type=" + shaderType); @@ -118,6 +149,17 @@ public static int loadShader(int shaderType, String source) { return shader; } + /** + * Create texture int. + * + * @param textureTarget the texture target + * @param bitmap the bitmap + * @param minFilter the min filter + * @param magFilter the mag filter + * @param wrapS the wrap s + * @param wrapT the wrap t + * @return the int + */ public static int createTexture(int textureTarget, @Nullable Bitmap bitmap, int minFilter, int magFilter, int wrapS, int wrapT) { int[] textureHandle = new int[1]; @@ -139,16 +181,37 @@ public static int createTexture(int textureTarget, @Nullable Bitmap bitmap, int return textureHandle[0]; } + /** + * Create texture int. + * + * @param textureTarget the texture target + * @return the int + */ public static int createTexture(int textureTarget) { return createTexture(textureTarget, null, GLES20.GL_LINEAR, GLES20.GL_LINEAR, GLES20.GL_CLAMP_TO_EDGE, GLES20.GL_CLAMP_TO_EDGE); } + /** + * Create texture int. + * + * @param textureTarget the texture target + * @param bitmap the bitmap + * @return the int + */ public static int createTexture(int textureTarget, Bitmap bitmap) { return createTexture(textureTarget, bitmap, GLES20.GL_LINEAR, GLES20.GL_LINEAR, GLES20.GL_CLAMP_TO_EDGE, GLES20.GL_CLAMP_TO_EDGE); } + /** + * Init effect texture. 
+ * + * @param width the width + * @param height the height + * @param textureId the texture id + * @param type the type + */ public static void initEffectTexture(int width, int height, int[] textureId, int type) { int len = textureId.length; if (len > 0) { @@ -168,8 +231,11 @@ public static void initEffectTexture(int width, int height, int[] textureId, int GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null); } } + /** * Checks to see if a GLES error has been raised. + * + * @param op the op */ public static void checkGlError(String op) { int error = GLES20.glGetError(); @@ -182,6 +248,9 @@ public static void checkGlError(String op) { /** * Allocates a direct float buffer, and populates it with the float array data. + * + * @param coords the coords + * @return the float buffer */ public static FloatBuffer createFloatBuffer(float[] coords) { // Allocate a direct ByteBuffer, using 4 bytes per float, and copy coords into it. @@ -193,6 +262,13 @@ public static FloatBuffer createFloatBuffer(float[] coords) { return fb; } + /** + * Read text from raw resource string. + * + * @param applicationContext the application context + * @param resourceId the resource id + * @return the string + */ public static String readTextFromRawResource(final Context applicationContext, @RawRes final int resourceId) { final InputStream inputStream = @@ -213,14 +289,22 @@ public static String readTextFromRawResource(final Context applicationContext, return body.toString(); } - public static float[] createTransformMatrix(int rotation, boolean flipH, boolean flipV){ + /** + * Create transform matrix float [ ]. 
+ * + * @param rotation the rotation + * @param flipH the flip h + * @param flipV the flip v + * @return the float [ ] + */ + public static float[] createTransformMatrix(int rotation, boolean flipH, boolean flipV) { float[] renderMVPMatrix = new float[16]; float[] tmp = new float[16]; Matrix.setIdentityM(tmp, 0); boolean _flipH = flipH; boolean _flipV = flipV; - if(rotation % 180 != 0){ + if (rotation % 180 != 0) { _flipH = flipV; _flipV = flipH; } @@ -234,7 +318,7 @@ public static float[] createTransformMatrix(int rotation, boolean flipH, boolean float _rotation = rotation; if (_rotation != 0) { - if(_flipH != _flipV){ + if (_flipH != _flipV) { _rotation *= -1; } Matrix.rotateM(tmp, 0, tmp, 0, _rotation, 0, 0, 1); @@ -245,8 +329,13 @@ public static float[] createTransformMatrix(int rotation, boolean flipH, boolean return renderMVPMatrix; } - public static EGLContext getCurrGLContext(){ - EGL10 egl = (EGL10)EGLContext.getEGL(); + /** + * Gets curr gl context. + * + * @return the curr gl context + */ + public static EGLContext getCurrGLContext() { + EGL10 egl = (EGL10) EGLContext.getEGL(); if (egl != null && !Objects.equals(egl.eglGetCurrentContext(), EGL10.EGL_NO_CONTEXT)) { return egl.eglGetCurrentContext(); } diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/Accelerometer.java b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/Accelerometer.java index fa772e63d..6e0b04f1d 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/Accelerometer.java +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/Accelerometer.java @@ -30,8 +30,12 @@ import android.hardware.SensorEventListener; import android.hardware.SensorManager; +/** + * The type Accelerometer. + */ public class Accelerometer { /** + * The enum Clockwise angle. 
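`createTransformMatrix` above encodes two easy-to-miss rules: a 90°/270° rotation swaps which flip acts horizontally vs vertically, and when exactly one flip is active the rotation sign must be negated. A pure-Kotlin restatement of just that bookkeeping (the real method then feeds the result into `android.opengl.Matrix`):

```kotlin
// Pure restatement of createTransformMatrix's flip/rotation bookkeeping.
data class Transform(val flipH: Boolean, val flipV: Boolean, val rotation: Int)

fun normalizeTransform(rotation: Int, flipH: Boolean, flipV: Boolean): Transform {
    var h = flipH
    var v = flipV
    if (rotation % 180 != 0) { // 90 or 270 degrees: the flip axes swap roles
        h = flipV
        v = flipH
    }
    var r = rotation
    if (r != 0 && h != v) r = -r // a single mirror reverses the rotation direction
    return Transform(h, v, r)
}
```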
* * @author MatrixCV * * |+---------+| * |_____O_____| */ - public enum CLOCKWISE_ANGLE { - Deg0(0), Deg90(1), Deg180(2), Deg270(3); + public enum ClockwiseAngle { + /** + * Deg 0 clockwise angle. + */ + Deg0(0), + /** + * Deg 90 clockwise angle. + */ + Deg90(1), + /** + * Deg 180 clockwise angle. + */ + Deg180(2), + /** + * Deg 270 clockwise angle. + */ + Deg270(3); private int value; - private CLOCKWISE_ANGLE(int value){ + + ClockwiseAngle(int value) { this.value = value; } + + /** + * Gets value. + * + * @return the value + */ public int getValue() { return value; } @@ -72,28 +98,30 @@ public int getValue() { private boolean hasStarted = false; - private CLOCKWISE_ANGLE rotation; + private ClockwiseAngle rotation; private SensorEvent sensorEvent; /** + * Instantiates a new Accelerometer. * - * @param ctx - * the Activity used to obtain the sensor manager + * @param ctx the Activity used to obtain the sensor manager */ public Accelerometer(Context ctx) { sensorManager = (SensorManager) ctx .getSystemService(Context.SENSOR_SERVICE); - rotation = CLOCKWISE_ANGLE.Deg90; + rotation = ClockwiseAngle.Deg90; } /** * Starts listening to the sensor */ public void start() { - if (hasStarted) return; + if (hasStarted) { + return; + } hasStarted = true; - rotation = CLOCKWISE_ANGLE.Deg90; + rotation = ClockwiseAngle.Deg90; sensorManager.registerListener(accListener, sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER), SensorManager.SENSOR_DELAY_NORMAL); @@ -103,20 +131,27 @@ public void start() { * Stops listening to the sensor */ public void stop() { - if (!hasStarted) return; + if (!hasStarted) { + return; + } hasStarted = false; sensorManager.unregisterListener(accListener); } /** + * Gets direction. * - * @return - * the current device rotation + * @return the current device rotation direction */ public int getDirection() { return rotation.getValue(); } + /** + * Gets sensor event.
+ * + * @return the sensor event + */ public SensorEvent getSensorEvent() { return sensorEvent; } @@ -135,19 +170,18 @@ public void onSensorChanged(SensorEvent arg0) { if (arg0.sensor.getType() == Sensor.TYPE_ACCELEROMETER) { float x = arg0.values[0]; float y = arg0.values[1]; - float z = arg0.values[2]; - if (Math.abs(x)>3 || Math.abs(y)>3) { - if (Math.abs(x)> Math.abs(y)) { + if (Math.abs(x) > 3 || Math.abs(y) > 3) { + if (Math.abs(x) > Math.abs(y)) { if (x > 0) { - rotation = CLOCKWISE_ANGLE.Deg0; + rotation = ClockwiseAngle.Deg0; } else { - rotation = CLOCKWISE_ANGLE.Deg180; + rotation = ClockwiseAngle.Deg180; } } else { if (y > 0) { - rotation = CLOCKWISE_ANGLE.Deg90; + rotation = ClockwiseAngle.Deg90; } else { - rotation = CLOCKWISE_ANGLE.Deg270; + rotation = ClockwiseAngle.Deg270; } } } diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/BeautyProcessor.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/BeautyProcessor.kt index bf8919d38..cfadc9996 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/BeautyProcessor.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/BeautyProcessor.kt @@ -20,8 +20,13 @@ import io.agora.beautyapi.sensetime.utils.LogUtils import io.agora.beautyapi.sensetime.utils.egl.GLCopyHelper import io.agora.beautyapi.sensetime.utils.egl.GLFrameBuffer import io.agora.beautyapi.sensetime.utils.egl.GLTextureBufferQueue -import io.agora.beautyapi.sensetime.utils.processor.Accelerometer.CLOCKWISE_ANGLE +import io.agora.beautyapi.sensetime.utils.processor.Accelerometer.ClockwiseAngle +/** + * Beauty processor + * + * @constructor Create empty Beauty processor + */ class BeautyProcessor : IBeautyProcessor { private val TAG = this::class.java.simpleName @@ -48,6 +53,12 @@ class BeautyProcessor : IBeautyProcessor { @Volatile private var isReleased = false + /** + * Initialize + * + * 
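The sensor callback above picks the dominant gravity axis (threshold 3 m/s²) to classify the device rotation; as a pure function over the x/y readings, returning the `ClockwiseAngle` ordinal:

```kotlin
import kotlin.math.abs

// Pure restatement of onSensorChanged's mapping from gravity components to
// a ClockwiseAngle ordinal (0=Deg0, 1=Deg90, 2=Deg180, 3=Deg270); readings
// below the threshold keep the previous value.
fun directionFor(x: Float, y: Float, previous: Int): Int {
    if (abs(x) <= 3 && abs(y) <= 3) return previous // device too flat to decide
    return if (abs(x) > abs(y)) {
        if (x > 0) 0 else 2
    } else {
        if (y > 0) 1 else 3
    }
}
```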
@param effectNative + * @param humanActionNative + */ override fun initialize( effectNative: STMobileEffectNative, humanActionNative: STMobileHumanActionNative @@ -56,6 +67,10 @@ class BeautyProcessor : IBeautyProcessor { mFaceDetector = FaceDetector(humanActionNative, effectNative) } + /** + * Release + * + */ override fun release() { isReleased = true mFaceDetector.release() @@ -82,10 +97,21 @@ class BeautyProcessor : IBeautyProcessor { mSTMobileHardwareBufferNative = null } + /** + * Enable sensor + * + * @param context + * @param enable + */ override fun enableSensor(context: Context, enable: Boolean) { mFaceDetector.enableSensor(context, enable) } + /** + * Trigger screen tap + * + * @param isDouble + */ override fun triggerScreenTap(isDouble: Boolean) { LogUtils.d( TAG, @@ -100,6 +126,12 @@ class BeautyProcessor : IBeautyProcessor { } + /** + * Process + * + * @param input + * @return + */ override fun process(input: InputInfo): OutputInfo? { if (isReleased) { return null @@ -423,6 +455,10 @@ class BeautyProcessor : IBeautyProcessor { return finalOutTextureId } + /** + * Reset + * + */ override fun reset() { mFaceDetector.reset() glTextureBufferQueue.reset() @@ -438,7 +474,7 @@ class BeautyProcessor : IBeautyProcessor { private fun getCurrentOrientation(): Int { - val dir = mFaceDetector.getAccelerometer()?.direction ?: CLOCKWISE_ANGLE.Deg90.value + val dir = mFaceDetector.getAccelerometer()?.direction ?: ClockwiseAngle.Deg90.value var orientation = dir - 1 if (orientation < 0) { orientation = dir xor 3 diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/FaceDetector.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/FaceDetector.kt index f48f361cd..7c6620f16 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/FaceDetector.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/FaceDetector.kt @@ -38,6 
+38,13 @@ import java.util.concurrent.ConcurrentLinkedQueue import java.util.concurrent.Executors import java.util.concurrent.Future +/** + * Face detector + * + * @property humanActionNative + * @property effectNative + * @constructor Create empty Face detector + */ class FaceDetector( private val humanActionNative: STMobileHumanActionNative, private val effectNative: STMobileEffectNative @@ -52,6 +59,12 @@ class FaceDetector( private val cacheFutureQueue = ConcurrentLinkedQueue>() private var isDequeBegin = false + /** + * Enable sensor + * + * @param context + * @param enable + */ fun enableSensor(context: Context, enable: Boolean) { if (enable) { if (accelerometer == null) { @@ -67,8 +80,16 @@ class FaceDetector( } } + /** + * Get accelerometer + * + */ fun getAccelerometer() = accelerometer + /** + * Reset + * + */ fun reset() { cacheIndex = 0 isDequeBegin = false @@ -79,12 +100,22 @@ class FaceDetector( } } + /** + * Release + * + */ fun release(){ reset() accelerometer?.stop() workerThread.shutdownNow() } + /** + * Enqueue + * + * @param iN + * @return + */ fun enqueue(iN: DetectorIn): Int { val index = cacheIndex val size = cacheFutureQueue.size @@ -102,6 +133,11 @@ class FaceDetector( return size } + /** + * Dequeue + * + * @return + */ fun dequeue(): DetectorOut? 
{ val size = cacheFutureQueue.size if(isDequeBegin || size >= cacheSize){ @@ -121,6 +157,10 @@ class FaceDetector( return null } + /** + * Size + * + */ fun size() = cacheFutureQueue.size private fun detectHuman(iN: DetectorIn, index: Int) { @@ -130,7 +170,7 @@ class FaceDetector( iN.orientation ) val deviceOrientation: Int = - accelerometer?.direction ?: Accelerometer.CLOCKWISE_ANGLE.Deg90.value + accelerometer?.direction ?: Accelerometer.ClockwiseAngle.Deg90.value val ret: Int = humanActionNative.nativeHumanActionDetectPtr( iN.bytes, iN.bytesType, @@ -169,7 +209,7 @@ class FaceDetector( */ private fun getHumanActionOrientation(frontCamera: Boolean, cameraRotation: Int): Int { // Get the orientation reported by the gravity sensor - var orientation: Int = accelerometer?.direction ?: Accelerometer.CLOCKWISE_ANGLE.Deg90.value + var orientation: Int = accelerometer?.direction ?: Accelerometer.ClockwiseAngle.Deg90.value // With the rear camera, when the sensor orientation is 0 or 2, the rear orientation is the opposite of the front one if (!frontCamera && orientation == STRotateType.ST_CLOCKWISE_ROTATE_0) { @@ -189,6 +229,18 @@ class FaceDetector( } + /** + * Detector in + * + * @property bytes + * @property bytesType + * @property width + * @property height + * @property isFront + * @property isMirror + * @property orientation + * @constructor Create empty Detector in + */ data class DetectorIn( val bytes: ByteArray, val bytesType: Int, @@ -199,6 +251,13 @@ class FaceDetector( val orientation: Int ) + /** + * Detector out + * + * @property humanResult + * @property animalResult + * @constructor Create empty Detector out + */ data class DetectorOut( val humanResult: Long, val animalResult: STMobileAnimalResult?
= null diff --git a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/IBeautyProcessor.kt b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/IBeautyProcessor.kt index deea9fe26..73b979fb7 100644 --- a/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/IBeautyProcessor.kt +++ b/Android/APIExample/app/src/main/java/io/agora/beautyapi/sensetime/utils/processor/IBeautyProcessor.kt @@ -6,6 +6,23 @@ import com.softsugar.stmobile.STCommonNative import com.softsugar.stmobile.STMobileEffectNative import com.softsugar.stmobile.STMobileHumanActionNative +/** + * Input info + * + * @property bytes + * @property bytesType + * @property textureId + * @property textureType + * @property textureMatrix + * @property diffBetweenBytesAndTexture + * @property width + * @property height + * @property isFrontCamera + * @property isMirror + * @property cameraOrientation + * @property timestamp + * @constructor Create empty Input info + */ data class InputInfo( val bytes: ByteArray? = null, val bytesType: Int = STCommonNative.ST_PIX_FMT_NV21, @@ -22,6 +39,18 @@ data class InputInfo( ) +/** + * Output info + * + * @property textureId + * @property textureType + * @property width + * @property height + * @property timestamp + * @property errorCode + * @property errorMessage + * @constructor Create empty Output info + */ class OutputInfo( val textureId: Int = 0, val textureType: Int = GLES20.GL_TEXTURE_2D, @@ -32,23 +61,64 @@ class OutputInfo( val errorMessage: String = "" ) +/** + * I beauty processor + * + * @constructor Create empty I beauty processor + */ interface IBeautyProcessor { + /** + * Initialize + * + * @param effectNative + * @param humanActionNative + */ fun initialize( effectNative: STMobileEffectNative, // beauty effect handle humanActionNative: STMobileHumanActionNative // face detection handle ) + /** + * Process + * + * @param input + * @return + */ fun process(input: InputInfo): OutputInfo?
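The orientation handling in the Accelerometer and BeautyProcessor diffs above boils down to two small pure functions: mapping raw accelerometer x/y readings to one of four clockwise angles, and shifting that angle index into the index the beauty pipeline expects. A standalone sketch in Java (the class, enum constants, and method names here are illustrative, not the patch's API):

```java
// Illustrative sketch of the orientation logic shown in the diffs above.
// ClockwiseAngle mirrors the enum renamed from CLOCKWISE_ANGLE in this patch;
// OrientationSketch and its method names are ours, for illustration only.
enum ClockwiseAngle { Deg0, Deg90, Deg180, Deg270 }

class OrientationSketch {
    // onSensorChanged: pick a rotation only when the device is clearly tilted
    // (|x| or |y| above 3), preferring the dominant axis; otherwise keep the
    // previous rotation.
    static ClockwiseAngle rotationFor(float x, float y, ClockwiseAngle current) {
        if (Math.abs(x) <= 3 && Math.abs(y) <= 3) {
            return current;
        }
        if (Math.abs(x) > Math.abs(y)) {
            return x > 0 ? ClockwiseAngle.Deg0 : ClockwiseAngle.Deg180;
        }
        return y > 0 ? ClockwiseAngle.Deg90 : ClockwiseAngle.Deg270;
    }

    // getCurrentOrientation: shift the sensor's angle index (0..3) down by one,
    // wrapping 0 around to 3 (dir ^ 3 is only reached when dir == 0).
    static int toProcessorOrientation(int dir) {
        int orientation = dir - 1;
        if (orientation < 0) {
            orientation = dir ^ 3;
        }
        return orientation;
    }
}
```

Note that ties (|x| == |y|) fall through to the y branch, matching the `>` comparison in the patched `onSensorChanged`.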
+ /** + * Enable sensor + * + * @param context + * @param enable + */ fun enableSensor(context: Context, enable: Boolean) + /** + * Trigger screen tap + * + * @param isDouble + */ fun triggerScreenTap(isDouble: Boolean) + /** + * Reset + * + */ fun reset() - + + /** + * Release + * + */ fun release() } +/** + * Create beauty processor + * + * @return + */ fun createBeautyProcessor(): IBeautyProcessor = BeautyProcessor() \ No newline at end of file diff --git a/Android/APIExample/app/src/main/res/layout/fragment_face_capture.xml b/Android/APIExample/app/src/main/res/layout/fragment_face_capture.xml new file mode 100644 index 000000000..de7b4c4ee --- /dev/null +++ b/Android/APIExample/app/src/main/res/layout/fragment_face_capture.xml @@ -0,0 +1,59 @@ + + + + + + + + + + + + + + + + + + + + + diff --git a/Android/APIExample/app/src/main/res/layout/fragment_media_player.xml b/Android/APIExample/app/src/main/res/layout/fragment_media_player.xml index 3ca2e0448..2f8ea7b18 100644 --- a/Android/APIExample/app/src/main/res/layout/fragment_media_player.xml +++ b/Android/APIExample/app/src/main/res/layout/fragment_media_player.xml @@ -15,19 +15,73 @@ + android:layout_height="wrap_content" + android:layout_weight="1.0" /> + android:layout_height="wrap_content" + android:layout_marginTop="1dp" + android:layout_weight="1.0" /> + + + + + + + + + + + + + + + + + + + + + + + + android:text="@string/play" + android:textSize="10sp" /> + android:text="@string/stop" + android:textSize="10sp" /> + android:text="@string/pause" + android:textSize="10sp" /> + android:text="@string/publish" + android:textSize="10sp" /> @@ -83,10 +137,10 @@ android:id="@+id/ctrl_progress_bar" android:layout_width="match_parent" android:layout_height="wrap_content" - android:layout_marginBottom="50dp" + android:layout_alignBottom="@id/link_box" android:layout_marginStart="16dp" android:layout_marginEnd="16dp" - android:layout_alignBottom="@id/link_box" /> + android:layout_marginBottom="50dp" /> + 
android:inputType="text|textUri" + android:singleLine="true" /> + android:digits="@string/chanel_support_char" + android:hint="@string/channel_id" /> @@ -57,7 +56,7 @@ android:layout_width="wrap_content" android:layout_height="wrap_content" android:layout_alignParentTop="true" - android:text="Echo Pretest" /> + android:text="Audio Echo Pretest" /> + + + + + + + + + + diff --git a/Android/APIExample/app/src/main/res/layout/fragment_spatial_sound.xml b/Android/APIExample/app/src/main/res/layout/fragment_spatial_sound.xml index a925cec4c..a016b69c9 100644 --- a/Android/APIExample/app/src/main/res/layout/fragment_spatial_sound.xml +++ b/Android/APIExample/app/src/main/res/layout/fragment_spatial_sound.xml @@ -66,8 +66,8 @@ + + + diff --git a/Android/APIExample/app/src/main/res/values-zh/strings.xml b/Android/APIExample/app/src/main/res/values-zh/strings.xml index 93657c4c0..a6a49886d 100644 --- a/Android/APIExample/app/src/main/res/values-zh/strings.xml +++ b/Android/APIExample/app/src/main/res/values-zh/strings.xml @@ -124,6 +124,7 @@ KTV版权音乐 本地/远端录制 本地合图 + 获取面捕数据 此示例演示了如何使用SDK加入频道进行纯音频通话的功能。 此示例演示了如何使用SDK加入频道进行音视频通话的功能。 @@ -191,6 +192,7 @@ 此示例演示了如何在画中画中使用sdk进行远端视频显示的功能。 此示例演示了如何集成第三方美颜sdk 此示例演示如何使用ktv api实现一个场景demo + 此示例演示如何使用面捕功能获取面捕数据 本地 SDK处理前 @@ -329,10 +331,17 @@ 默认 扬声器 听筒 - 耳机(非TypeC) + 耳机 耳机(TypeC) 蓝牙耳机 第三方播放器 当前设备不支持！ 垫片 + 自定义 + 请先参阅README配置stream encrypt!
+ 鎾斁闊宠建 + 鎺ㄩ侀煶杞 + 闊宠建%d + 璇疯仈绯诲0缃戝鏈嶈幏鍙栭潰鎹曡瘉涔﹀苟閰嶇疆鍒癆UTHENTICATION甯搁噺涓 + 璇锋墦寮閫氱煡鏉冮檺锛岄槻姝㈠悗鍙板綍闊充腑鏂 \ No newline at end of file diff --git a/Android/APIExample/app/src/main/res/values/arrays.xml b/Android/APIExample/app/src/main/res/values/arrays.xml index 018fd1473..d7588640a 100644 --- a/Android/APIExample/app/src/main/res/values/arrays.xml +++ b/Android/APIExample/app/src/main/res/values/arrays.xml @@ -183,6 +183,7 @@ AES_256_GCM AES_128_GCM2 AES_256_GCM2 + @string/custom VD_120x120 @@ -261,7 +262,6 @@ @string/audio_route_speakerphone @string/audio_route_earpiece @string/audio_route_headset - @string/audio_route_headset_typec @string/audio_route_headset_bluetooth diff --git a/Android/APIExample/app/src/main/res/values/strings.xml b/Android/APIExample/app/src/main/res/values/strings.xml index 7f918b177..0989f4ffb 100644 --- a/Android/APIExample/app/src/main/res/values/strings.xml +++ b/Android/APIExample/app/src/main/res/values/strings.xml @@ -130,6 +130,7 @@ KTV Copyright Music Local/Remote MediaRecorder LocalVideoTranscoding + Face Capture This example demonstrates how to use the SDK to join channels for voice only calls. This example demonstrates how to use the SDK to join channels for audio and video calls. @@ -198,6 +199,7 @@ This example demonstrates how to show remove video in Picture In Picture mode. This example demonstrates how to integrate third-party beauty sdk. This example demonstrates how to using ktv api to implement a scene demo. + This example demonstrates how to use face capture function. PlayOut PreProcess @@ -345,7 +347,7 @@ default speakerphone earpiece - headset(Not TypeC) + headset headset(TypeC) bluetooth headset Third Party Player @@ -354,4 +356,11 @@ NativePlayer The feature is unavailable in the device! Video Image + Custom + Please refer to README to config stream encrypt firstly. 
+ Player Stream + Publish Stream + AudioStream%d + Please contact Shengwang customer service to obtain the face capture certificate and configure it to the AUTHENTICATION constant. + Please turn on notification permission to prevent background recording from being interrupted. diff --git a/Android/APIExample/build.gradle b/Android/APIExample/build.gradle index 2709ee1d0..18b43da42 100644 --- a/Android/APIExample/build.gradle +++ b/Android/APIExample/build.gradle @@ -3,6 +3,12 @@ plugins { id 'com.android.application' version '7.2.0' apply false id 'com.android.library' version '7.2.0' apply false id 'org.jetbrains.kotlin.android' version '1.7.20' apply false + id "io.gitlab.arturbosch.detekt" version "1.23.1" apply true +} + +allprojects { + apply from: "${rootDir.absolutePath}/checkstyle.gradle" + apply from: "${rootDir.absolutePath}/detekt.gradle" } task clean(type: Delete) { diff --git a/Android/APIExample/checkstyle.gradle b/Android/APIExample/checkstyle.gradle new file mode 100644 index 000000000..e2c7b02cc --- /dev/null +++ b/Android/APIExample/checkstyle.gradle @@ -0,0 +1,55 @@ +apply plugin: 'checkstyle' + +checkstyle { + toolVersion = '10.12.3' +} + +def filterCheckstyleFiles(String diffFiles){ + ArrayList filterList = new ArrayList(); + String [] files = diffFiles.split("\\n") + for (String file : files) { + if (file.endsWith(".java") + || file.endsWith(".xml") + || file.endsWith(".properties") + ) { + filterList.add(file) + } + } + return filterList +} + +task checkstyle(type: Checkstyle) { + source 'src' + + exclude '**/gen/**' + exclude '**/test/**' + exclude '**/androidTest/**' + exclude '**/R.java' + exclude '**/BuildConfig.java' + exclude '**/authpack.java' + + if (project.hasProperty('commit_diff_files')) { + def ft = filterCheckstyleFiles(project.property('commit_diff_files')) + if (ft.size() > 0) { + for (int i = 0; i < ft.size(); i++) { + String splitter = ft[i]; + String[] fileName = splitter.split("/"); + include '**/' + 
fileName[fileName.size() - 1]; + } + } else { + include 'null' + } + + println("checkstyle >> check commit diff files...") + println("checkstyle >> " + includes.toList()) + } else { + include '**/*.java' + println("checkstyle >> check all java files...") + } + + configFile new File(rootDir, "checkstyle.xml") + classpath = files() +} + + + diff --git a/Android/APIExample/checkstyle.xml b/Android/APIExample/checkstyle.xml new file mode 100644 index 000000000..11f45dd03 --- /dev/null +++ b/Android/APIExample/checkstyle.xml @@ -0,0 +1,230 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/Android/APIExample/detekt-baseline.xml b/Android/APIExample/detekt-baseline.xml new file mode 100644 index 000000000..c45c2dd23 --- /dev/null +++ b/Android/APIExample/detekt-baseline.xml @@ -0,0 +1,8 @@ + + + + VariableNaming:val TAG + MagicNumber:FaceUnityBeautyAPIImpl.kt + + + diff --git a/Android/APIExample/detekt-config.yml b/Android/APIExample/detekt-config.yml new file mode 100644 index 000000000..edffed1f3 --- /dev/null +++ b/Android/APIExample/detekt-config.yml @@ -0,0 +1,751 @@ +build: + excludeCorrectable: false + +config: + validation: true + warningsAsErrors: false + checkExhaustiveness: false + # when writing own rules with new properties, exclude the property path e.g.: 'my_rule_set,.*>.*>[my_property]' + excludes: '' + +processors: + active: true + exclude: + - 'DetektProgressListener' + # - 'KtFileCountProcessor' + # - 'PackageCountProcessor' + # - 'ClassCountProcessor' + # - 'FunctionCountProcessor' + # - 'PropertyCountProcessor' + # - 'ProjectComplexityProcessor' + # - 'ProjectCognitiveComplexityProcessor' + # - 
'ProjectLLOCProcessor' + # - 'ProjectCLOCProcessor' + # - 'ProjectLOCProcessor' + # - 'ProjectSLOCProcessor' + # - 'LicenseHeaderLoaderExtension' + +console-reports: + active: true + exclude: + - 'ProjectStatisticsReport' + - 'ComplexityReport' + - 'NotificationReport' + - 'FindingsReport' + - 'FileBasedFindingsReport' + # - 'LiteFindingsReport' + +output-reports: + active: true + exclude: + # - 'TxtOutputReport' + # - 'XmlOutputReport' + # - 'HtmlOutputReport' + # - 'MdOutputReport' + # - 'SarifOutputReport' + +comments: + active: true + AbsentOrWrongFileLicense: + active: false + licenseTemplateFile: 'license.template' + licenseTemplateIsRegex: false + CommentOverPrivateFunction: + active: false + CommentOverPrivateProperty: + active: false + DeprecatedBlockTag: + active: false + EndOfSentenceFormat: + active: false + endOfSentenceFormat: '([.?!][ \t\n\r\f<])|([.?!:]$)' + KDocReferencesNonPublicProperty: + active: false + excludes: ['**/test/**', '**/androidTest/**', '**/commonTest/**', '**/jvmTest/**', '**/androidUnitTest/**', '**/androidInstrumentedTest/**', '**/jsTest/**', '**/iosTest/**'] + OutdatedDocumentation: + active: false + matchTypeParameters: true + matchDeclarationsOrder: true + allowParamOnConstructorProperties: false + UndocumentedPublicClass: + active: true + excludes: ['**/test/**', '**/androidTest/**', '**/commonTest/**', '**/jvmTest/**', '**/androidUnitTest/**', '**/androidInstrumentedTest/**', '**/jsTest/**', '**/iosTest/**'] + searchInNestedClass: true + searchInInnerClass: true + searchInInnerObject: true + searchInInnerInterface: true + searchInProtectedClass: false + UndocumentedPublicFunction: + active: true + excludes: ['**/test/**', '**/androidTest/**', '**/commonTest/**', '**/jvmTest/**', '**/androidUnitTest/**', '**/androidInstrumentedTest/**', '**/jsTest/**', '**/iosTest/**'] + searchProtectedFunction: false + UndocumentedPublicProperty: + active: true + excludes: ['**/test/**', '**/androidTest/**', '**/commonTest/**', 
'**/jvmTest/**', '**/androidUnitTest/**', '**/androidInstrumentedTest/**', '**/jsTest/**', '**/iosTest/**'] + searchProtectedProperty: false + +complexity: + active: true + CognitiveComplexMethod: + active: false + ComplexCondition: + active: true + threshold: 12 + ComplexInterface: + active: false + includeStaticDeclarations: false + includePrivateDeclarations: false + ignoreOverloaded: false + CyclomaticComplexMethod: + active: true + ignoreSingleWhenExpression: false + ignoreSimpleWhenEntries: false + ignoreNestingFunctions: false + threshold: 80 + nestingFunctions: + - 'also' + - 'apply' + - 'forEach' + - 'isNotNull' + - 'ifNull' + - 'let' + - 'run' + - 'use' + - 'with' + LabeledExpression: + active: false + ignoredLabels: [] + LargeClass: + active: true + threshold: 10000 + LongMethod: + active: true + threshold: 300 + LongParameterList: + active: false + ignoreDefaultParameters: false + ignoreDataClasses: true + ignoreAnnotatedParameter: [] + MethodOverloading: + active: false + NamedArguments: + active: false + ignoreArgumentsMatchingNames: false + NestedBlockDepth: + active: true + threshold: 30 + NestedScopeFunctions: + active: false + functions: + - 'kotlin.apply' + - 'kotlin.run' + - 'kotlin.with' + - 'kotlin.let' + - 'kotlin.also' + ReplaceSafeCallChainWithRun: + active: false + StringLiteralDuplication: + active: false + excludes: ['**/test/**', '**/androidTest/**', '**/commonTest/**', '**/jvmTest/**', '**/androidUnitTest/**', '**/androidInstrumentedTest/**', '**/jsTest/**', '**/iosTest/**'] + ignoreAnnotation: true + ignoreStringsRegex: '$^' + TooManyFunctions: + active: true + excludes: ['**/test/**', '**/androidTest/**', '**/commonTest/**', '**/jvmTest/**', '**/androidUnitTest/**', '**/androidInstrumentedTest/**', '**/jsTest/**', '**/iosTest/**'] + ignoreDeprecated: false + ignorePrivate: false + ignoreOverridden: false + thresholdInFiles: 80 + thresholdInClasses: 80 + thresholdInInterfaces: 80 + thresholdInObjects: 80 + thresholdInEnums: 80 + 
+coroutines: + active: true + GlobalCoroutineUsage: + active: false + InjectDispatcher: + active: true + dispatcherNames: + - 'IO' + - 'Default' + - 'Unconfined' + RedundantSuspendModifier: + active: true + SleepInsteadOfDelay: + active: true + SuspendFunSwallowedCancellation: + active: false + SuspendFunWithCoroutineScopeReceiver: + active: false + SuspendFunWithFlowReturnType: + active: true + +empty-blocks: + active: true + EmptyCatchBlock: + active: true + allowedExceptionNameRegex: '_|(ignore|expected).*' + EmptyClassBlock: + active: true + EmptyDefaultConstructor: + active: true + EmptyDoWhileBlock: + active: true + EmptyElseBlock: + active: true + EmptyFinallyBlock: + active: true + EmptyForBlock: + active: true + EmptyFunctionBlock: + active: true + ignoreOverridden: true + EmptyIfBlock: + active: true + EmptyInitBlock: + active: true + EmptySecondaryConstructor: + active: true + EmptyTryBlock: + active: true + EmptyWhenBlock: + active: true + EmptyWhileBlock: + active: true + +exceptions: + active: true + ExceptionRaisedInUnexpectedLocation: + active: true + methodNames: + - 'equals' + - 'finalize' + - 'hashCode' + - 'toString' + InstanceOfCheckForException: + active: true + excludes: ['**/test/**', '**/androidTest/**', '**/commonTest/**', '**/jvmTest/**', '**/androidUnitTest/**', '**/androidInstrumentedTest/**', '**/jsTest/**', '**/iosTest/**'] + NotImplementedDeclaration: + active: false + ObjectExtendsThrowable: + active: false + PrintStackTrace: + active: true + RethrowCaughtException: + active: true + ReturnFromFinally: + active: true + ignoreLabeled: false + SwallowedException: + active: false + ignoredExceptionTypes: + - 'InterruptedException' + - 'MalformedURLException' + - 'NumberFormatException' + - 'ParseException' + allowedExceptionNameRegex: '_|(ignore|expected).*' + ThrowingExceptionFromFinally: + active: true + ThrowingExceptionInMain: + active: false + ThrowingExceptionsWithoutMessageOrCause: + active: true + excludes: ['**/test/**', 
'**/androidTest/**', '**/commonTest/**', '**/jvmTest/**', '**/androidUnitTest/**', '**/androidInstrumentedTest/**', '**/jsTest/**', '**/iosTest/**'] + exceptions: + - 'ArrayIndexOutOfBoundsException' + - 'Exception' + - 'IllegalArgumentException' + - 'IllegalMonitorStateException' + - 'IllegalStateException' + - 'IndexOutOfBoundsException' + - 'NullPointerException' + - 'RuntimeException' + - 'Throwable' + ThrowingNewInstanceOfSameException: + active: true + TooGenericExceptionCaught: + active: false + excludes: ['**/test/**', '**/androidTest/**', '**/commonTest/**', '**/jvmTest/**', '**/androidUnitTest/**', '**/androidInstrumentedTest/**', '**/jsTest/**', '**/iosTest/**'] + exceptionNames: + - 'ArrayIndexOutOfBoundsException' + - 'Error' + - 'Exception' + - 'IllegalMonitorStateException' + - 'IndexOutOfBoundsException' + - 'NullPointerException' + - 'RuntimeException' + - 'Throwable' + allowedExceptionNameRegex: '_|(ignore|expected).*' + TooGenericExceptionThrown: + active: false + exceptionNames: + - 'Error' + - 'Exception' + - 'RuntimeException' + - 'Throwable' + +naming: + active: true + BooleanPropertyNaming: + active: false + allowedPattern: '^(is|has|are)' + ClassNaming: + active: true + classPattern: '[A-Z][a-zA-Z0-9]*' + ConstructorParameterNaming: + active: true + parameterPattern: '[A-Za-z][A-Za-z0-9]*' + privateParameterPattern: '[a-z][A-Za-z0-9]*' + excludeClassPattern: '$^' + EnumNaming: + active: true + enumEntryPattern: '[A-Za-z][_a-zA-Z0-9]*' + ForbiddenClassName: + active: false + forbiddenName: [] + FunctionNaming: + active: true + excludes: ['**/test/**', '**/androidTest/**', '**/commonTest/**', '**/jvmTest/**', '**/androidUnitTest/**', '**/androidInstrumentedTest/**', '**/jsTest/**', '**/iosTest/**'] + functionPattern: '[a-z][a-zA-Z0-9]*' + excludeClassPattern: '$^' + FunctionParameterNaming: + active: true + parameterPattern: '[A-Za-z][A-Za-z0-9]*' + excludeClassPattern: '$^' + InvalidPackageDeclaration: + active: true + rootPackage: '' + 
requireRootInDeclaration: false + LambdaParameterNaming: + active: false + parameterPattern: '[a-z][A-Za-z0-9]*|_' + MatchingDeclarationName: + active: true + mustBeFirst: true + MemberNameEqualsClassName: + active: true + ignoreOverridden: true + NoNameShadowing: + active: true + NonBooleanPropertyPrefixedWithIs: + active: false + ObjectPropertyNaming: + active: true + constantPattern: '[A-Za-z][_A-Za-z0-9]*' + propertyPattern: '[A-Za-z][_A-Za-z0-9]*' + privatePropertyPattern: '(_)?[A-Za-z][_A-Za-z0-9]*' + PackageNaming: + active: true + packagePattern: '[a-z]+(\.[a-z][A-Za-z0-9]*)*' + TopLevelPropertyNaming: + active: true + constantPattern: '[A-Z][_A-Z0-9]*' + propertyPattern: '[A-Za-z][_A-Za-z0-9]*' + privatePropertyPattern: '_?[A-Za-z][_A-Za-z0-9]*' + VariableMaxLength: + active: false + maximumVariableNameLength: 64 + VariableMinLength: + active: false + minimumVariableNameLength: 1 + VariableNaming: + active: true + variablePattern: '(_)?[A-Za-z][_A-Za-z0-9]*' + privateVariablePattern: '(_)?[A-Za-z][_A-Za-z0-9]*' + excludeClassPattern: '$^' + +performance: + active: true + ArrayPrimitive: + active: true + CouldBeSequence: + active: false + ForEachOnRange: + active: true + excludes: ['**/test/**', '**/androidTest/**', '**/commonTest/**', '**/jvmTest/**', '**/androidUnitTest/**', '**/androidInstrumentedTest/**', '**/jsTest/**', '**/iosTest/**'] + SpreadOperator: + active: true + excludes: ['**/test/**', '**/androidTest/**', '**/commonTest/**', '**/jvmTest/**', '**/androidUnitTest/**', '**/androidInstrumentedTest/**', '**/jsTest/**', '**/iosTest/**'] + UnnecessaryPartOfBinaryExpression: + active: false + UnnecessaryTemporaryInstantiation: + active: true + +potential-bugs: + active: true + AvoidReferentialEquality: + active: true + forbiddenTypePatterns: + - 'kotlin.String' + CastNullableToNonNullableType: + active: false + CastToNullableType: + active: false + Deprecation: + active: false + DontDowncastCollectionTypes: + active: false + 
DoubleMutabilityForCollection: + active: true + mutableTypes: + - 'kotlin.collections.MutableList' + - 'kotlin.collections.MutableMap' + - 'kotlin.collections.MutableSet' + - 'java.util.ArrayList' + - 'java.util.LinkedHashSet' + - 'java.util.HashSet' + - 'java.util.LinkedHashMap' + - 'java.util.HashMap' + ElseCaseInsteadOfExhaustiveWhen: + active: false + ignoredSubjectTypes: [] + EqualsAlwaysReturnsTrueOrFalse: + active: true + EqualsWithHashCodeExist: + active: true + ExitOutsideMain: + active: false + ExplicitGarbageCollectionCall: + active: true + HasPlatformType: + active: true + IgnoredReturnValue: + active: true + restrictToConfig: true + returnValueAnnotations: + - 'CheckResult' + - '*.CheckResult' + - 'CheckReturnValue' + - '*.CheckReturnValue' + ignoreReturnValueAnnotations: + - 'CanIgnoreReturnValue' + - '*.CanIgnoreReturnValue' + returnValueTypes: + - 'kotlin.sequences.Sequence' + - 'kotlinx.coroutines.flow.*Flow' + - 'java.util.stream.*Stream' + ignoreFunctionCall: [] + ImplicitDefaultLocale: + active: true + ImplicitUnitReturnType: + active: false + allowExplicitReturnType: true + InvalidRange: + active: true + IteratorHasNextCallsNextMethod: + active: true + IteratorNotThrowingNoSuchElementException: + active: true + LateinitUsage: + active: false + excludes: ['**/test/**', '**/androidTest/**', '**/commonTest/**', '**/jvmTest/**', '**/androidUnitTest/**', '**/androidInstrumentedTest/**', '**/jsTest/**', '**/iosTest/**'] + ignoreOnClassesPattern: '' + MapGetWithNotNullAssertionOperator: + active: true + MissingPackageDeclaration: + active: false + excludes: ['**/*.kts'] + NullCheckOnMutableProperty: + active: false + NullableToStringCall: + active: false + PropertyUsedBeforeDeclaration: + active: false + UnconditionalJumpStatementInLoop: + active: false + UnnecessaryNotNullCheck: + active: false + UnnecessaryNotNullOperator: + active: true + UnnecessarySafeCall: + active: true + UnreachableCatchBlock: + active: true + UnreachableCode: + active: true + 
UnsafeCallOnNullableType: + active: true + excludes: ['**/test/**', '**/androidTest/**', '**/commonTest/**', '**/jvmTest/**', '**/androidUnitTest/**', '**/androidInstrumentedTest/**', '**/jsTest/**', '**/iosTest/**'] + UnsafeCast: + active: true + UnusedUnaryOperator: + active: true + UselessPostfixExpression: + active: true + WrongEqualsTypeParameter: + active: true + +style: + active: true + AlsoCouldBeApply: + active: false + BracesOnIfStatements: + active: false + singleLine: 'never' + multiLine: 'always' + BracesOnWhenStatements: + active: false + singleLine: 'necessary' + multiLine: 'consistent' + CanBeNonNullable: + active: false + CascadingCallWrapping: + active: false + includeElvis: true + ClassOrdering: + active: false + CollapsibleIfStatements: + active: false + DataClassContainsFunctions: + active: false + conversionFunctionPrefix: + - 'to' + allowOperators: false + DataClassShouldBeImmutable: + active: false + DestructuringDeclarationWithTooManyEntries: + active: true + maxDestructuringEntries: 3 + DoubleNegativeLambda: + active: false + negativeFunctions: + - reason: 'Use `takeIf` instead.' + value: 'takeUnless' + - reason: 'Use `all` instead.' + value: 'none' + negativeFunctionNameParts: + - 'not' + - 'non' + EqualsNullCall: + active: true + EqualsOnSignatureLine: + active: false + ExplicitCollectionElementAccessMethod: + active: false + ExplicitItLambdaParameter: + active: true + ExpressionBodySyntax: + active: false + includeLineWrapping: false + ForbiddenAnnotation: + active: false + annotations: + - reason: 'it is a java annotation. Use `Suppress` instead.' + value: 'java.lang.SuppressWarnings' + - reason: 'it is a java annotation. Use `kotlin.Deprecated` instead.' + value: 'java.lang.Deprecated' + - reason: 'it is a java annotation. Use `kotlin.annotation.MustBeDocumented` instead.' + value: 'java.lang.annotation.Documented' + - reason: 'it is a java annotation. Use `kotlin.annotation.Target` instead.' 
+ value: 'java.lang.annotation.Target' + - reason: 'it is a java annotation. Use `kotlin.annotation.Retention` instead.' + value: 'java.lang.annotation.Retention' + - reason: 'it is a java annotation. Use `kotlin.annotation.Repeatable` instead.' + value: 'java.lang.annotation.Repeatable' + - reason: 'Kotlin does not support @Inherited annotation, see https://youtrack.jetbrains.com/issue/KT-22265' + value: 'java.lang.annotation.Inherited' + ForbiddenComment: + active: true + comments: + - reason: 'Forbidden FIXME todo marker in comment, please fix the problem.' + value: 'FIXME:' + - reason: 'Forbidden STOPSHIP todo marker in comment, please address the problem before shipping the code.' + value: 'STOPSHIP:' + - reason: 'Forbidden TODO todo marker in comment, please do the changes.' + value: 'TODO:' + allowedPatterns: '' + ForbiddenImport: + active: false + imports: [] + forbiddenPatterns: '' + ForbiddenMethodCall: + active: false + methods: + - reason: 'print does not allow you to configure the output stream. Use a logger instead.' + value: 'kotlin.io.print' + - reason: 'println does not allow you to configure the output stream. Use a logger instead.' 
+ value: 'kotlin.io.println' + ForbiddenSuppress: + active: false + rules: [] + ForbiddenVoid: + active: true + ignoreOverridden: false + ignoreUsageInGenerics: false + FunctionOnlyReturningConstant: + active: true + ignoreOverridableFunction: true + ignoreActualFunction: true + excludedFunctions: [] + LoopWithTooManyJumpStatements: + active: true + maxJumpCount: 3 + MagicNumber: + active: false + excludes: ['**/test/**', '**/androidTest/**', '**/commonTest/**', '**/jvmTest/**', '**/androidUnitTest/**', '**/androidInstrumentedTest/**', '**/jsTest/**', '**/iosTest/**', '**/*.kts', '**/beautyapi/**'] + ignoreNumbers: + - '-1' + - '0' + - '1' + - '2' + - '100' + ignoreHashCodeFunction: true + ignorePropertyDeclaration: true + ignoreLocalVariableDeclaration: true + ignoreConstantDeclaration: true + ignoreCompanionObjectPropertyDeclaration: true + ignoreAnnotation: true + ignoreNamedArgument: true + ignoreEnums: true + ignoreRanges: true + ignoreExtensionFunctions: true + MandatoryBracesLoops: + active: false + MaxChainedCallsOnSameLine: + active: false + maxChainedCalls: 5 + MaxLineLength: + active: true + maxLineLength: 220 + excludePackageStatements: true + excludeImportStatements: true + excludeCommentStatements: false + excludeRawStrings: true + ModifierOrder: + active: true + MultilineLambdaItParameter: + active: false + MultilineRawStringIndentation: + active: false + indentSize: 4 + trimmingMethods: + - 'trimIndent' + - 'trimMargin' + NestedClassesVisibility: + active: true + NewLineAtEndOfFile: + active: false + NoTabs: + active: false + NullableBooleanCheck: + active: false + ObjectLiteralToLambda: + active: true + OptionalAbstractKeyword: + active: true + OptionalUnit: + active: false + PreferToOverPairSyntax: + active: false + ProtectedMemberInFinalClass: + active: true + RedundantExplicitType: + active: false + RedundantHigherOrderMapUsage: + active: true + RedundantVisibilityModifierRule: + active: false + ReturnCount: + active: true + max: 12 + 
excludedFunctions: + - 'equals' + excludeLabeled: false + excludeReturnFromLambda: true + excludeGuardClauses: false + SafeCast: + active: true + SerialVersionUIDInSerializableClass: + active: true + StringShouldBeRawString: + active: false + maxEscapedCharacterCount: 2 + ignoredCharacters: [] + ThrowsCount: + active: true + max: 12 + excludeGuardClauses: false + TrailingWhitespace: + active: false + TrimMultilineRawString: + active: false + trimmingMethods: + - 'trimIndent' + - 'trimMargin' + UnderscoresInNumericLiterals: + active: false + acceptableLength: 4 + allowNonStandardGrouping: false + UnnecessaryAnnotationUseSiteTarget: + active: false + UnnecessaryApply: + active: true + UnnecessaryBackticks: + active: false + UnnecessaryBracesAroundTrailingLambda: + active: false + UnnecessaryFilter: + active: true + UnnecessaryInheritance: + active: true + UnnecessaryInnerClass: + active: false + UnnecessaryLet: + active: false + UnnecessaryParentheses: + active: false + allowForUnclearPrecedence: false + UnusedParameter: + active: true + allowedNames: 'ignored|expected' + UnusedPrivateClass: + active: true + UnusedPrivateMember: + active: true + allowedNames: '' + UnusedPrivateProperty: + active: true + allowedNames: '_|ignored|expected|serialVersionUID' + UseAnyOrNoneInsteadOfFind: + active: true + UseArrayLiteralsInAnnotations: + active: true + UseCheckNotNull: + active: true + UseCheckOrError: + active: true + UseDataClass: + active: false + allowVars: false + UseEmptyCounterpart: + active: false + UseIfEmptyOrIfBlank: + active: false + UseIfInsteadOfWhen: + active: false + ignoreWhenContainingVariableDeclaration: false + UseIsNullOrEmpty: + active: true + UseLet: + active: false + UseOrEmpty: + active: true + UseRequire: + active: true + UseRequireNotNull: + active: true + UseSumOfInsteadOfFlatMapSize: + active: false + UselessCallOnNotNull: + active: true + UtilityClassWithPublicConstructor: + active: true + VarCouldBeVal: + active: true + ignoreLateinitVar: 
false + WildcardImport: + active: true + excludeImports: + - 'java.util.*' \ No newline at end of file diff --git a/Android/APIExample/detekt.gradle b/Android/APIExample/detekt.gradle new file mode 100644 index 000000000..5b591f446 --- /dev/null +++ b/Android/APIExample/detekt.gradle @@ -0,0 +1,55 @@ +apply plugin: "io.gitlab.arturbosch.detekt" + + +def filterDetektFiles(String diffFiles){ + ArrayList filterList = new ArrayList(); + String [] files = diffFiles.split("\\n") + for (String file : files) { + if (file.endsWith(".kt")) { + filterList.add(file) + } + } + return filterList +} + +detekt { + + toolVersion = "1.23.1" + + // Builds the AST in parallel. Rules are always executed in parallel. + // Can lead to speedups in larger projects. `false` by default. + parallel = true + + // Define the detekt configuration(s) you want to use. + // Defaults to the default detekt configuration. + config.setFrom("${rootDir.absolutePath}/detekt-config.yml") + + // Specifying a baseline file. All findings stored in this file are suppressed in subsequent runs of detekt. 
+ baseline = file("${rootDir.absolutePath}/detekt-baseline.xml") + + if (project.hasProperty('commit_diff_files')) { + def ft = filterDetektFiles(project.property('commit_diff_files')) + source.from = files() + for (int i = 0; i < ft.size(); i++) { + String splitter = ft[i]; + String[] fileName = splitter.split("/") + int srcIndex = 0; + for (final def name in fileName) { + if(name == "src"){ + break + } + srcIndex += name.length() + 1 + } + source.from += splitter.substring(srcIndex) + } + println("detekt >> check commit diff files...") + println("detekt >> " + source.from.toList()) + } else { + println("detekt >> check all kt files...") + } +} + +dependencies { + detektPlugins "io.gitlab.arturbosch.detekt:detekt-formatting:1.23.1" +} + diff --git a/Android/APIExample/git-hooks.gradle b/Android/APIExample/git-hooks.gradle new file mode 100644 index 000000000..b714e0a38 --- /dev/null +++ b/Android/APIExample/git-hooks.gradle @@ -0,0 +1,5 @@ +task installGitHooks(type: Copy){ + from "${rootDir.absolutePath}/../../.githooks/pre-commit" + into "${rootDir.absolutePath}/../../.git/hooks" +} +preBuild.dependsOn installGitHooks \ No newline at end of file diff --git a/Android/APIExample/gradle.properties b/Android/APIExample/gradle.properties index d7c1d04b5..0aa5dd64c 100644 --- a/Android/APIExample/gradle.properties +++ b/Android/APIExample/gradle.properties @@ -20,8 +20,5 @@ android.enableJetifier=true # read enable simple filter section on README first before set this flag to TRUE simpleFilter = false - -# read enable beauty section on README first before set this flag to TRUE -beauty_sensetime = false -beauty_faceunity = false -beauty_bytedance = false \ No newline at end of file +# read enable stream encrypt section on README first before set this flag to TRUE +streamEncrypt = false diff --git a/Android/APIExample/settings.gradle b/Android/APIExample/settings.gradle index f96eab9ce..dfb7230de 100644 --- a/Android/APIExample/settings.gradle +++ 
b/Android/APIExample/settings.gradle @@ -21,4 +21,7 @@ rootProject.name='APIExample' include ':app' if (simpleFilter.toBoolean()) { include ':agora-simple-filter' +} +if (streamEncrypt.toBoolean()) { + include ':agora-stream-encrypt' } \ No newline at end of file diff --git a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/AudioMixing/AudioMixing.swift b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/AudioMixing/AudioMixing.swift index de67e3efd..e2f533a9d 100644 --- a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/AudioMixing/AudioMixing.swift +++ b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/AudioMixing/AudioMixing.swift @@ -152,8 +152,8 @@ class AudioMixingMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -312,8 +312,8 @@ extension AudioMixingMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, 
didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) diff --git a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/CustomAudioRender/CustomAudioRender.swift b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/CustomAudioRender/CustomAudioRender.swift index 289323637..d4bbcb631 100644 --- a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/CustomAudioRender/CustomAudioRender.swift +++ b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/CustomAudioRender/CustomAudioRender.swift @@ -95,8 +95,8 @@ class CustomAudioRenderMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -130,8 +130,8 @@ extension CustomAudioRenderMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) 
diff --git a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/CustomAudioSource/CustomAudioSource.swift b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/CustomAudioSource/CustomAudioSource.swift index 877ab62ac..3c15032fc 100644 --- a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/CustomAudioSource/CustomAudioSource.swift +++ b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/CustomAudioSource/CustomAudioSource.swift @@ -92,8 +92,8 @@ class CustomAudioSourceMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -128,8 +128,8 @@ extension CustomAudioSourceMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) diff --git 
a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/CustomPcmAudioSource/CustomPcmAudioSource.swift b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/CustomPcmAudioSource/CustomPcmAudioSource.swift index c7a67743e..ce7da158b 100644 --- a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/CustomPcmAudioSource/CustomPcmAudioSource.swift +++ b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/CustomPcmAudioSource/CustomPcmAudioSource.swift @@ -105,8 +105,8 @@ class CustomPcmAudioSourceMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -166,8 +166,8 @@ extension CustomPcmAudioSourceMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) diff --git 
a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/PrecallTest/PrecallTest.swift b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/PrecallTest/PrecallTest.swift index fa91bff3f..919c1d6e8 100644 --- a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/PrecallTest/PrecallTest.swift +++ b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/PrecallTest/PrecallTest.swift @@ -56,11 +56,15 @@ class PrecallTestEntry : BaseViewController @IBAction func doEchoTest(sender: UIButton) { - let ret = agoraKit.startEchoTest(withInterval: 10, successBlock: nil) + let testConfig = AgoraEchoTestConfiguration() + testConfig.intervalInSeconds = 10 + testConfig.enableAudio = true + testConfig.channelId = "AudioEchoTest" + "\(Int.random(in: 1...1000))" + let ret = agoraKit.startEchoTest(withConfig: testConfig) if ret != 0 { // for errors please take a look at: - // CN https://docs.agora.io/cn/Video/API%20Reference/oc/Classes/AgoraRtcEngineKit.html#//api/name/enableEncryption:encryptionConfig: - // EN https://docs.agora.io/en/video-calling/develop/media-stream-encryption#implement--media-stream-encryption + // CN https://doc.shengwang.cn/api-ref/rtc/ios/error-code + // EN https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode showAlert(title: "Error", message: "startEchoTest call failed: \(ret), please check your params") } showPopover(isValidate: false, seconds: 10) {[unowned self] in diff --git a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/RawAudioData/RawAudioData.swift b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/RawAudioData/RawAudioData.swift index 2e2a9c5ee..abc76f7db 100644 --- a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/RawAudioData/RawAudioData.swift +++ b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/RawAudioData/RawAudioData.swift @@ -42,8 +42,8 @@ class RawAudioDataViewController: BaseViewController { NetworkManager.shared.generateToken(channelName: channelId, success: {
token in let result = self.agoraKit.joinChannel(byToken: token, channelId: channelId, info: nil, uid: 0) if result != 0 { - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "Join channel failed with errorCode: \(result)") } }) @@ -117,8 +117,8 @@ extension RawAudioDataViewController: AgoraAudioFrameDelegate { // MARK: - AgoraRtcEngineDelegate extension RawAudioDataViewController: AgoraRtcEngineDelegate { func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code LogUtils.log(message: "Error occur: \(errorCode)", level: .error) self.showAlert(title: "Error", message: "Error: \(errorCode.description)") } diff --git a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/RhythmPlayer/RhythmPlayer.swift b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/RhythmPlayer/RhythmPlayer.swift index 7b1f3ae91..08fbdadb6 100644 --- a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/RhythmPlayer/RhythmPlayer.swift +++ b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/RhythmPlayer/RhythmPlayer.swift @@ -104,8 +104,8 @@ class RhythmPlayerMain : BaseViewController if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: 
https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -159,8 +159,8 @@ extension RhythmPlayerMain : AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) diff --git a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/SpatialAudio/SpatialAudio.swift b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/SpatialAudio/SpatialAudio.swift index 3ee89b800..8486a07a6 100644 --- a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/SpatialAudio/SpatialAudio.swift +++ b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/SpatialAudio/SpatialAudio.swift @@ -50,9 +50,9 @@ class SpatialAudioMain: BaseViewController { var currentAngle = 0.0 var currentDistance = 0.0 var maxDistance: CGFloat = 10 - let forward = [NSNumber(1.0), NSNumber(0.0), NSNumber(0.0)] - let right = [NSNumber(0.0), NSNumber(1.0), NSNumber(0.0)] - let up = [NSNumber(0.0), NSNumber(0.0), NSNumber(1.0)] + let forward = simd_float3(1.0, 0.0, 0.0) + let right = simd_float3(0.0, 1.0, 0.0) + let up = 
simd_float3(0.0, 0.0, 1.0) override func viewDidLoad() { super.viewDidLoad() @@ -63,7 +63,6 @@ class SpatialAudioMain: BaseViewController { Util.configPrivatization(agoraKit: agoraKit) agoraKit.setChannelProfile(.liveBroadcasting) agoraKit.setClientRole(GlobalSettings.shared.getUserRole()) - agoraKit.muteAllRemoteAudioStreams(true) agoraKit.setParameters("{\"rtc.enable_debug_log\":true}") agoraKit.setLogFile(LogUtils.sdkLogPath()) agoraKit.enableAudio() @@ -72,8 +71,6 @@ class SpatialAudioMain: BaseViewController { let localSpatialConfig = AgoraLocalSpatialAudioConfig() localSpatialConfig.rtcEngine = agoraKit localSpatial = AgoraLocalSpatialAudioKit.sharedLocalSpatialAudio(with: localSpatialConfig) - localSpatial.muteLocalAudioStream(false) - localSpatial.muteAllRemoteAudioStreams(false) localSpatial.setAudioRecvRange(Float(SCREENSIZE.height)) localSpatial.setMaxAudioRecvCount(2) localSpatial.setDistanceUnit(1) @@ -175,14 +172,13 @@ class SpatialAudioMain: BaseViewController { audioZone.position = getViewCenterPostion(view: voiceContainerView1) localSpatial.setZones([audioZone]) } else { - let audioZone = AgoraSpatialAudioZone() - audioZone.forwardLength = Float(SCREENSIZE.height) - audioZone.rightLength = Float(SCREENSIZE.width) - audioZone.upLength = Float(maxDistance) - localSpatial.setZones([audioZone]) + localSpatial.setZones(nil) } let pos = getViewCenterPostion(view: selfPostionView) - localSpatial.updateSelfPosition(pos, axisForward: forward, axisRight: right, axisUp: up) + localSpatial.updateSelfPosition(pos, + axisForward: forward, + axisRight: right, + axisUp: up) } private func updateMediaPlayerParams(mediaPlayer: AgoraRtcMediaPlayerProtocol, @@ -209,7 +205,10 @@ class SpatialAudioMain: BaseViewController { func updatePosition() { let pos = getViewCenterPostion(view: selfPostionView) - localSpatial.updateSelfPosition(pos, axisForward: forward, axisRight: right, axisUp: up) + localSpatial.updateSelfPosition(pos, + axisForward: forward, + axisRight: 
right, + axisUp: up) } private func getPlayerPostion(view: UIView) -> AgoraRemoteVoicePositionInfo { @@ -220,8 +219,8 @@ class SpatialAudioMain: BaseViewController { positionInfo.forward = forward return positionInfo } - private func getViewCenterPostion(view: UIView) -> [NSNumber] { - [NSNumber(value: Double(view.center.x)), NSNumber(value: Double(view.center.y)), NSNumber(0.0)] + private func getViewCenterPostion(view: UIView) -> simd_float3 { + simd_float3(Float(view.center.x), Float(view.center.y), 0.0) } private func updateSpatialAngle(objectCenter: CGPoint) -> AgoraRemoteVoicePositionInfo { @@ -248,8 +247,8 @@ class SpatialAudioMain: BaseViewController { let posForward = spatialDistance * cos(currentAngle); let posRight = spatialDistance * sin(currentAngle); - let position = [NSNumber(value: posForward), NSNumber(value: posRight), NSNumber(0.0)] - let forward = [NSNumber(1.0), NSNumber(0.0), NSNumber(0.0)] + let position = simd_float3(Float(posForward), Float(posRight), 0.0) + let forward = simd_float3(1.0, 0.0, 0.0) let positionInfo = AgoraRemoteVoicePositionInfo() positionInfo.position = position @@ -266,10 +265,12 @@ extension SpatialAudioMain: AgoraRtcEngineDelegate { remoteUserButton1.setTitle("\(uid)", for: .normal) remoteUserButton1.tag = Int(uid) remoteUserButton1.isHidden = false + localSpatial.updateRemotePosition(uid, positionInfo: getPlayerPostion(view: remoteUserButton1)) } else if remoteUserButton2.tag <= 0 { remoteUserButton2.setTitle("\(uid)", for: .normal) remoteUserButton2.tag = Int(uid) remoteUserButton2.isHidden = false + localSpatial.updateRemotePosition(uid, positionInfo: getPlayerPostion(view: remoteUserButton2)) } } @@ -287,12 +288,13 @@ extension SpatialAudioMain: AgoraRtcEngineDelegate { remoteUserButton2.isHidden = true remoteUserButton2.tag = 0 } + localSpatial.removeRemotePosition(uid) } } extension SpatialAudioMain: AgoraRtcMediaPlayerDelegate { - func AgoraRtcMediaPlayer(_ playerKit: AgoraRtcMediaPlayerProtocol, didChangedTo 
state: AgoraMediaPlayerState, error: AgoraMediaPlayerError) { - print("didChangedTo: \(state.rawValue), \(error.rawValue)") + func AgoraRtcMediaPlayer(_ playerKit: AgoraRtcMediaPlayerProtocol, didChangedTo state: AgoraMediaPlayerState, reason: AgoraMediaPlayerReason) { + print("didChangedTo: \(state.rawValue), \(reason.rawValue)") if state == .openCompleted || state == .playBackAllLoopsCompleted || state == .playBackCompleted { playerKit.play() } diff --git a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/VoiceChanger/VoiceChanger.swift b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/VoiceChanger/VoiceChanger.swift index 332d43e77..cdd93c6fc 100644 --- a/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/VoiceChanger/VoiceChanger.swift +++ b/iOS/APIExample-Audio/APIExample-Audio/Examples/Advanced/VoiceChanger/VoiceChanger.swift @@ -398,7 +398,8 @@ class VoiceChangerMain: BaseViewController { // parameter of setAudioProfile to AUDIO_PROFILE_MUSIC_HIGH_QUALITY(4) // or AUDIO_PROFILE_MUSIC_HIGH_QUALITY_STEREO(5), and to set // scenario parameter to AUDIO_SCENARIO_GAME_STREAMING(3). 
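[Editor's note] The VoiceChanger hunk below replaces the 3.x-style combined `setAudioProfile(_:scenario:)` call with two independent calls, which is the pattern this patch adopts for the 4.x SDK. A minimal sketch of the same migration, with engine creation omitted and the helper name being illustrative only:

```swift
import AgoraRtcKit

// 4.x splits the old combined call into two independent settings.
// Before (3.x-style, removed in this patch):
//   agoraKit.setAudioProfile(.musicHighQualityStereo, scenario: .gameStreaming)
// After (4.x):
func configureAudioForVoiceEffects(_ agoraKit: AgoraRtcEngineKit) {
    // Audio profile: sampling rate, bitrate, encoding mode, channel count.
    agoraKit.setAudioProfile(.musicHighQualityStereo)
    // Audio scenario: set separately, affecting volume handling and routing.
    agoraKit.setAudioScenario(.gameStreaming)
}
```

This matches the comment block above: high-quality music profiles (4 or 5) plus the game-streaming scenario (3) are recommended before applying voice-changing effects.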
- agoraKit.setAudioProfile(.musicHighQualityStereo, scenario: .gameStreaming) + agoraKit.setAudioProfile(.musicHighQualityStereo) + agoraKit.setAudioScenario(.gameStreaming) // make myself a broadcaster agoraKit.setChannelProfile(.liveBroadcasting) @@ -431,8 +432,8 @@ class VoiceChangerMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -467,8 +468,8 @@ extension VoiceChangerMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) diff --git a/iOS/APIExample-Audio/APIExample-Audio/Examples/Basic/JoinChannelAudio(Token)/JoinChannelAudioToken.swift b/iOS/APIExample-Audio/APIExample-Audio/Examples/Basic/JoinChannelAudio(Token)/JoinChannelAudioToken.swift index b4cf6d4cd..fb6b80514 100644 --- 
a/iOS/APIExample-Audio/APIExample-Audio/Examples/Basic/JoinChannelAudio(Token)/JoinChannelAudioToken.swift +++ b/iOS/APIExample-Audio/APIExample-Audio/Examples/Basic/JoinChannelAudio(Token)/JoinChannelAudioToken.swift @@ -212,8 +212,8 @@ class JoinChannelAudioTokenMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } } @@ -289,8 +289,8 @@ extension JoinChannelAudioTokenMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) diff --git a/iOS/APIExample-Audio/APIExample-Audio/Examples/Basic/JoinChannelAudio/JoinChannelAudio.swift b/iOS/APIExample-Audio/APIExample-Audio/Examples/Basic/JoinChannelAudio/JoinChannelAudio.swift index 140471b95..bb1a88f1c 100644 --- 
a/iOS/APIExample-Audio/APIExample-Audio/Examples/Basic/JoinChannelAudio/JoinChannelAudio.swift +++ b/iOS/APIExample-Audio/APIExample-Audio/Examples/Basic/JoinChannelAudio/JoinChannelAudio.swift @@ -153,8 +153,8 @@ class JoinChannelAudioMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -231,8 +231,8 @@ extension JoinChannelAudioMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) diff --git a/iOS/APIExample-Audio/Podfile b/iOS/APIExample-Audio/Podfile index 9ad892601..0dc8aaefd 100644 --- a/iOS/APIExample-Audio/Podfile +++ b/iOS/APIExample-Audio/Podfile @@ -7,7 +7,7 @@ target 'APIExample-Audio' do pod 'Floaty', '~> 4.2.0' pod 'AGEVideoLayout', '~> 1.0.2' - pod 'AgoraAudio_iOS', '4.2.6' - # pod 'sdk', :path => 'sdk.podspec' + pod 
'AgoraAudio_iOS', '4.3.0' + # pod 'sdk', :path => 'sdk.podspec' end diff --git a/iOS/APIExample-OC/APIExample-OC/Common/ExternalVideo/AgoraMetalRender.swift b/iOS/APIExample-OC/APIExample-OC/Common/ExternalVideo/AgoraMetalRender.swift index 67ed080b6..54f9cdda9 100644 --- a/iOS/APIExample-OC/APIExample-OC/Common/ExternalVideo/AgoraMetalRender.swift +++ b/iOS/APIExample-OC/APIExample-OC/Common/ExternalVideo/AgoraMetalRender.swift @@ -162,7 +162,10 @@ extension AgoraMetalRender: AgoraVideoFrameDelegate { } func getVideoFormatPreference() -> AgoraVideoFormat { - return .NV12 + .cvPixelNV12 + } + func getObservedFramePosition() -> AgoraVideoFramePosition { + .preRenderer } } diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/AudioMixing/AudioMixing.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/AudioMixing/AudioMixing.m index 151ee615e..b0f220ab2 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/AudioMixing/AudioMixing.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/AudioMixing/AudioMixing.m @@ -188,8 +188,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -212,9 +212,6 @@ - (IBAction)onStartAudioMixing:(UIButton *)sender { int result = [self.agoraKit startAudioMixing:filePath loopback:NO cycle:-1]; if (result != 0) { [self showAlertWithTitle:@"Error" message:[NSString stringWithFormat:@"stopAudioMixing call failed: %d, please check your params",result]]; - } else { - [self stopProgressTimer]; - [self updateTotalDuration: YES]; } } - 
(IBAction)onStopAudioMixing:(UIButton *)sender { @@ -303,8 +300,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/ContentInspect/ContentInspect.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/ContentInspect/ContentInspect.m index 4a31bbe0b..d495d9f35 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/ContentInspect/ContentInspect.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/ContentInspect/ContentInspect.m @@ -110,8 +110,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -133,8 +133,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error 
descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CreateDataStream/CreateDataStream.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CreateDataStream/CreateDataStream.m index 56e52037c..e618145b8 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CreateDataStream/CreateDataStream.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CreateDataStream/CreateDataStream.m @@ -119,8 +119,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -161,8 +161,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: 
https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomAudioRender/CustomAudioRender.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomAudioRender/CustomAudioRender.m index 1b2c079d2..e25523865 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomAudioRender/CustomAudioRender.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomAudioRender/CustomAudioRender.m @@ -129,8 +129,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -146,8 +146,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// 
en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomPcmAudioSource/CustomPcmAudioSource.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomPcmAudioSource/CustomPcmAudioSource.m index a4a68427c..fc5f76fe2 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomPcmAudioSource/CustomPcmAudioSource.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomPcmAudioSource/CustomPcmAudioSource.m @@ -130,8 +130,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -169,8 +169,8 @@ - (void)onAudioFrame:(void *)data { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param 
errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomVideoRender/CustomVideoRender.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomVideoRender/CustomVideoRender.m index e5432e86c..7f312463b 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomVideoRender/CustomVideoRender.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomVideoRender/CustomVideoRender.m @@ -122,8 +122,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -141,8 +141,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld 
occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomVideoSourcePush/CustomVideoSourcePush.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomVideoSourcePush/CustomVideoSourcePush.m index b53781ef5..07832ef30 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomVideoSourcePush/CustomVideoSourcePush.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/CustomVideoSourcePush/CustomVideoSourcePush.m @@ -88,6 +88,7 @@ - (void)viewDidLoad { // so you will have to prepare the preview yourself self.customCamera = [[AgoraYUVImageSourcePush alloc] initWithSize:CGSizeMake(320, 180) fileName:@"sample" frameRate:15]; self.customCamera.delegate = self; + self.customCamera.trackId = 0; [self.customCamera startSource]; [self.agoraKit setExternalVideoSource:YES useTexture:YES @@ -112,8 +113,8 @@ - (void)viewDidLoad { AgoraRtcChannelMediaOptions *options = [[AgoraRtcChannelMediaOptions alloc] init]; options.autoSubscribeAudio = YES; options.autoSubscribeVideo = YES; - options.publishCameraTrack = YES; - options.publishMicrophoneTrack = YES; + options.publishCustomAudioTrack = YES; + options.publishCustomVideoTrack = YES; options.clientRoleType = AgoraClientRoleBroadcaster; [[NetworkManager shared] generateTokenWithChannelName:channelName uid:0 success:^(NSString * _Nullable token) { @@ -121,8 +122,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -156,7 +157,7 @@ - 
(void)onVideoFrame:(CVPixelBufferRef)buffer size:(CGSize)size trackId:(NSUInte videoFrame.format = 12; videoFrame.textureBuf = buffer; videoFrame.rotation = rotation; - [self.agoraKit pushExternalVideoFrame:videoFrame]; + [self.agoraKit pushExternalVideoFrame:videoFrame videoTrackId:trackId]; AgoraOutputVideoFrame *outputVideoFrame = [[AgoraOutputVideoFrame alloc] init]; outputVideoFrame.width = size.width; @@ -170,8 +171,8 @@ - (void)onVideoFrame:(CVPixelBufferRef)buffer size:(CGSize)size trackId:(NSUInte /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/FusionCDN/FusionCDN.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/FusionCDN/FusionCDN.m index c22bbadeb..070a141f9 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/FusionCDN/FusionCDN.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/FusionCDN/FusionCDN.m @@ -252,8 +252,8 @@ - (void)switchToRtcStreaming { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: 
https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -270,7 +270,7 @@ - (void)viewDidDisappear:(BOOL)animated { [AgoraRtcEngineKit destroy]; } -- (void)onDirectCdnStreamingStateChanged:(AgoraDirectCdnStreamingState)state error:(AgoraDirectCdnStreamingError)error message:(NSString *)message { +- (void)onDirectCdnStreamingStateChanged:(AgoraDirectCdnStreamingState)state reason:(AgoraDirectCdnStreamingReason)reason message:(NSString *)message { dispatch_async(dispatch_get_main_queue(), ^{ switch (state) { case AgoraDirectCdnStreamingStateRunning: @@ -294,7 +294,7 @@ - (void)onDirectCdnStreamingStateChanged:(AgoraDirectCdnStreamingState)state err case AgoraDirectCdnStreamingStateFailed: [self showAlertWithTitle:@"Error" message:@"Start Streaming failed, please go back to previous page and check the settings."]; default: - [LogUtil log:[NSString stringWithFormat:@"onDirectCdnStreamingStateChanged: %ld, %ld %@", state, error, message] level:(LogLevelDebug)]; + [LogUtil log:[NSString stringWithFormat:@"onDirectCdnStreamingStateChanged: %ld, %ld %@", state, reason, message] level:(LogLevelDebug)]; break; } }); @@ -303,8 +303,8 @@ - (void)onDirectCdnStreamingStateChanged:(AgoraDirectCdnStreamingState)state err /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - 
(void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; @@ -328,8 +328,8 @@ - (void)rtcEngine:(AgoraRtcEngineKit *)engine streamUnpublishedWithUrl:(NSString [self.containerView layoutStream:@[self.localView]]; } -- (void)rtcEngine:(AgoraRtcEngineKit *)engine rtmpStreamingChangedToState:(NSString *)url state:(AgoraRtmpStreamingState)state errCode:(AgoraRtmpStreamingErrorCode)errCode { - [LogUtil log:[NSString stringWithFormat:@"On rtmpStreamingChangedToState, state: %ld errCode: %ld", state, errCode] level:(LogLevelDebug)]; +- (void)rtcEngine:(AgoraRtcEngineKit *)engine rtmpStreamingChangedToState:(NSString *)url state:(AgoraRtmpStreamingState)state reason:(AgoraRtmpStreamingReason)reason { + [LogUtil log:[NSString stringWithFormat:@"On rtmpStreamingChangedToState, state: %ld reason: %ld", state, reason] level:(LogLevelDebug)]; } /// callback when a remote user is joinning the channel, note audience in live broadcast mode will NOT trigger this event @@ -501,8 +501,8 @@ - (IBAction)setRtcStreaming:(UISwitch *)sender { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } else { AgoraRtcVideoCanvas *videoCanvas = [[AgoraRtcVideoCanvas alloc] init]; @@ -551,8 +551,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code 
description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; @@ -601,12 +601,12 @@ - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOfflineOfUid:(NSUInteger)uid re [LogUtil log:[NSString stringWithFormat:@"remote user left: %lu", uid] level:(LogLevelDebug)]; } -- (void)AgoraRtcMediaPlayer:(id)playerKit didChangedToState:(AgoraMediaPlayerState)state error:(AgoraMediaPlayerError)error { - [LogUtil log:[NSString stringWithFormat:@"player rtc channel publish helper state changed to: %ld error: %ld", state, error] level:(LogLevelDebug)]; +- (void)AgoraRtcMediaPlayer:(id)playerKit didChangedToState:(AgoraMediaPlayerState)state reason:(AgoraMediaPlayerReason)reason { + [LogUtil log:[NSString stringWithFormat:@"player rtc channel publish helper state changed to: %ld reason: %ld", state, reason] level:(LogLevelDebug)]; dispatch_async(dispatch_get_main_queue(), ^{ switch (state) { case AgoraMediaPlayerStateFailed: - [self showAlertWithTitle:[NSString stringWithFormat:@"media player error: %ld", error]]; + [self showAlertWithTitle:[NSString stringWithFormat:@"media player error: %ld", reason]]; break; case AgoraMediaPlayerStateOpenCompleted: diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/JoinMultiChannel/JoinMultiChannel.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/JoinMultiChannel/JoinMultiChannel.m index 4bfefaa79..fe9dd7b3b 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/JoinMultiChannel/JoinMultiChannel.m
+++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/JoinMultiChannel/JoinMultiChannel.m @@ -43,8 +43,8 @@ @implementation Channel2Delegate /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; @@ -206,8 +206,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -225,8 +225,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call 
failed: %d, please check your params", result); } }]; @@ -262,8 +262,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/LiveStreaming/LiveStreaming.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/LiveStreaming/LiveStreaming.m index fc301f579..de9f417fc 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/LiveStreaming/LiveStreaming.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/LiveStreaming/LiveStreaming.m @@ -241,8 +241,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -379,7 +379,7 @@ - (IBAction)onTapWatermarkSwitch:(UISwitch *)sender { } - (IBAction)onTapDualStreamSwitch:(UISwitch *)sender { - [self.agoraKit 
enableDualStreamMode:sender.isOn]; + [self.agoraKit setDualStreamMode:sender.isOn ? AgoraEnableSimulcastStream : AgoraDisableSimulcastStream]; self.dualStreamTipsLabel.text = sender.isOn ? @"已开启" : @"默认: 大流"; } @@ -403,8 +403,8 @@ - (IBAction)onToggleUltraLowLatency:(UISwitch *)sender { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/MediaChannelRelay/MediaChannelRelay.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/MediaChannelRelay/MediaChannelRelay.m index 266421e58..8beb99cb2 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/MediaChannelRelay/MediaChannelRelay.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/MediaChannelRelay/MediaChannelRelay.m @@ -126,8 +126,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: 
%d, please check your params", result); } }]; @@ -170,8 +170,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/MediaPlayer/MediaPlayer.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/MediaPlayer/MediaPlayer.m index 3adf44894..2420e3bc9 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/MediaPlayer/MediaPlayer.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/MediaPlayer/MediaPlayer.m @@ -188,8 +188,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -208,8 +208,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: 
https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } [self doOpenMediaUrlWithSender: nil]; @@ -269,13 +269,13 @@ - (void)viewDidDisappear:(BOOL)animated { [AgoraRtcEngineKit destroy]; } -- (void)AgoraRtcMediaPlayer:(id)playerKit didChangedToState:(AgoraMediaPlayerState)state error:(AgoraMediaPlayerError)error { - [LogUtil log:[NSString stringWithFormat:@"player rtc channel publish helper state changed to: %ld error: %ld", state, error]]; +- (void)AgoraRtcMediaPlayer:(id)playerKit didChangedToState:(AgoraMediaPlayerState)state reason:(AgoraMediaPlayerReason)reason { + [LogUtil log:[NSString stringWithFormat:@"player rtc channel publish helper state changed to: %ld reason: %ld", state, reason]]; __weak MediaPlayer *weakSelf = self; dispatch_async(dispatch_get_main_queue(), ^{ switch (state) { case AgoraMediaPlayerStateFailed: - [weakSelf showAlertWithTitle:[NSString stringWithFormat:@"media player error: %ld", error]]; + [weakSelf showAlertWithTitle:[NSString stringWithFormat:@"media player error: %ld", reason]]; break; case AgoraMediaPlayerStateOpenCompleted: @@ -325,8 +325,8 @@ - (void)AgoraRtcMediaPlayer:(id)playerKit didChange /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: 
https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/MutliCamera/MutliCamera.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/MutliCamera/MutliCamera.m index 68806e8c2..0f62f1bf5 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/MutliCamera/MutliCamera.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/MutliCamera/MutliCamera.m @@ -131,8 +131,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -196,8 +196,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString 
stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/PictureInPicture/PictureInPicture.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/PictureInPicture/PictureInPicture.m index bb3545c7c..1d5b4c979 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/PictureInPicture/PictureInPicture.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/PictureInPicture/PictureInPicture.m @@ -134,8 +134,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -201,8 +201,8 @@ - (void)pictureInPictureControllerDidStopPictureInPicture:(AVPictureInPictureCon /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git 
a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RTMPStreaming/RTMPStreaming.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RTMPStreaming/RTMPStreaming.m index 27c6157ba..ab2523f43 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RTMPStreaming/RTMPStreaming.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RTMPStreaming/RTMPStreaming.m @@ -151,8 +151,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -208,8 +208,8 @@ - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; @@ -276,8 +276,8 @@ - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOfflineOfUid:(NSUInteger)uid re } } -- (void)rtcEngine:(AgoraRtcEngineKit *)engine rtmpStreamingChangedToState:(NSString *)url 
state:(AgoraRtmpStreamingState)state errCode:(AgoraRtmpStreamingErrorCode)errCode { - [LogUtil log:[NSString stringWithFormat:@"streamStateChanged: %@ state %ld error %ld", url, state, errCode] level:(LogLevelDebug)]; +- (void)rtcEngine:(AgoraRtcEngineKit *)engine rtmpStreamingChangedToState:(NSString *)url state:(AgoraRtmpStreamingState)state reason:(AgoraRtmpStreamingReason)reason { + [LogUtil log:[NSString stringWithFormat:@"streamStateChanged: %@ state %ld reason %ld", url, state, reason] level:(LogLevelDebug)]; if (state == AgoraRtmpStreamingStateRunning) { [self showAlertWithTitle:@"Notice" message:@"RTMP Publish Success"]; self.isPublished = YES; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RawAudioData/RawAudioData.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RawAudioData/RawAudioData.m index 7ca1b2024..b03235ebc 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RawAudioData/RawAudioData.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RawAudioData/RawAudioData.m @@ -113,8 +113,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -156,8 +156,8 @@ - (BOOL)onPlaybackAudioFrameBeforeMixing:(AgoraAudioFrame *)frame channelId:(NSS /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: 
https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RawVideoData/RawVideoData.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RawVideoData/RawVideoData.m index 70d7af4e6..ad66220f0 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RawVideoData/RawVideoData.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RawVideoData/RawVideoData.m @@ -126,8 +126,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -160,11 +160,15 @@ - (BOOL)onRenderVideoFrame:(AgoraOutputVideoFrame *)videoFrame uid:(NSUInteger)u return YES; } +- (AgoraVideoFormat)getVideoFormatPreference { + return AgoraVideoFormatCVPixelBGRA; +} + /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: 
https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RhythmPlayer/RhythmPlayer.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RhythmPlayer/RhythmPlayer.m index 345f9c875..53cadbef3 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RhythmPlayer/RhythmPlayer.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/RhythmPlayer/RhythmPlayer.m @@ -125,8 +125,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -160,8 +160,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: 
https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/ScreenShare/ScreenShare.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/ScreenShare/ScreenShare.m index 3ead8bcba..9d606e1cb 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/ScreenShare/ScreenShare.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/ScreenShare/ScreenShare.m @@ -166,8 +166,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -256,8 +256,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - 
(void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; @@ -301,7 +301,8 @@ - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOfflineOfUid:(NSUInteger)uid re self.remoteView.uid = 0; [LogUtil log:[NSString stringWithFormat:@"remote user left: %lu", uid] level:(LogLevelDebug)]; } -- (void)rtcEngine:(AgoraRtcEngineKit *)engine localVideoStateChangedOfState:(AgoraVideoLocalState)state error:(AgoraLocalVideoStreamError)error sourceType:(AgoraVideoSourceType)sourceType { + +- (void)rtcEngine:(AgoraRtcEngineKit *)engine localVideoStateChangedOfState:(AgoraVideoLocalState)state reason:(AgoraLocalVideoStreamReason)reason sourceType:(AgoraVideoSourceType)sourceType { if (state == AgoraVideoLocalStateCapturing && sourceType == AgoraVideoSourceTypeScreen) { self.option.publishScreenCaptureAudio = YES; self.option.publishScreenCaptureVideo = YES; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/SimpleFilter/SimpleFilter.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/SimpleFilter/SimpleFilter.m index fcc38cd7b..db04971b7 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/SimpleFilter/SimpleFilter.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/SimpleFilter/SimpleFilter.m @@ -132,8 +132,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -169,8 +169,8 @@ - (void)onEvent:(NSString *)provider extension:(NSString *)extension 
key:(NSStri /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/SpatialAudio/SpatialAudio.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/SpatialAudio/SpatialAudio.m index 4423090d1..499203b27 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/SpatialAudio/SpatialAudio.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/SpatialAudio/SpatialAudio.m @@ -54,9 +54,9 @@ @interface SpatialAudio () @property (nonatomic, assign) CGFloat currentAngle; @property (nonatomic, assign) CGFloat currentDistance; @property (nonatomic, assign) CGFloat maxDistance; -@property (nonatomic, strong) NSArray *forward; -@property (nonatomic, strong) NSArray *right; -@property (nonatomic, strong) NSArray *up; +@property (nonatomic, assign) simd_float3 forward; +@property (nonatomic, assign) simd_float3 right; +@property (nonatomic, assign) simd_float3 up; @end @@ -79,9 +79,9 @@ - (void)viewDidLoad { self.currentAngle = 0; self.currentDistance = 0; self.maxDistance = 10; - self.forward = @[@(1.0), @(0.0), @(0.0)]; - self.right = @[@(0.0), @(1.0), @(0.0)]; - self.up = @[@(0.0), @(0.0), @(1.0)]; + self.forward = simd_make_float3(1.0, 0.0, 0.0); + self.right = simd_make_float3(0.0, 1.0, 0.0); + 
self.up = simd_make_float3(0.0, 0.0, 1.0); self.infoLabel.text = @"Please move the red icon to experience the 3D audio effect".localized; [self.voiceButton1 setTitle:@"" forState:(UIControlStateNormal)]; @@ -102,7 +102,6 @@ - (void)viewDidLoad { config.channelProfile = AgoraChannelProfileLiveBroadcasting; self.agoraKit = [AgoraRtcEngineKit sharedEngineWithConfig:config delegate:self]; - [self.agoraKit muteAllRemoteAudioStreams:YES]; // make myself a broadcaster [self.agoraKit setClientRole:(AgoraClientRoleBroadcaster)]; // enable video module and set up video encoding configs @@ -114,8 +113,6 @@ - (void)viewDidLoad { AgoraLocalSpatialAudioConfig * localSpatialConfig = [[AgoraLocalSpatialAudioConfig alloc] init]; localSpatialConfig.rtcEngine = self.agoraKit; self.localSpatial = [AgoraLocalSpatialAudioKit sharedLocalSpatialAudioWithConfig:localSpatialConfig]; - [self.localSpatial muteLocalAudioStream:NO]; - [self.localSpatial muteAllRemoteAudioStreams:NO]; [self.localSpatial setAudioRecvRange:[UIScreen mainScreen].bounds.size.height]; [self.localSpatial setMaxAudioRecvCount:2]; [self.localSpatial setDistanceUnit:1]; @@ -162,14 +159,13 @@ - (IBAction)onTapAudioAttenuationSwitch:(UISwitch *)sender { audioZone.position = [self getViewCenterPostion:self.voiceContainerView1]; [self.localSpatial setZones:@[audioZone]]; } else { - AgoraSpatialAudioZone *audioZone = [[AgoraSpatialAudioZone alloc] init]; - audioZone.forwardLength = [UIScreen mainScreen].bounds.size.height; - audioZone.rightLength = [UIScreen mainScreen].bounds.size.width; - audioZone.upLength = self.maxDistance; - [self.localSpatial setZones:@[audioZone]]; + [self.localSpatial setZones:nil]; } - NSArray *pos = [self getViewCenterPostion:self.selfPostionView]; - [self.localSpatial updateSelfPosition:pos axisForward:self.forward axisRight:self.right axisUp:self.up]; + simd_float3 pos = [self getViewCenterPostion:self.selfPostionView]; + [self.localSpatial updateSelfPosition:pos + axisForward:self.forward + 
axisRight:self.right + axisUp:self.up]; } @@ -221,8 +217,8 @@ - (void)joinChannel { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -250,26 +246,29 @@ - (void)updateMediaPlayerParams: (id)mediaPlayer ac } - (void)updatePosition { - NSArray *pos = [self getViewCenterPostion:self.selfPostionView]; - [self.localSpatial updateSelfPosition:pos axisForward:self.forward axisRight:self.right axisUp:self.up]; + simd_float3 pos = [self getViewCenterPostion:self.selfPostionView]; + [self.localSpatial updateSelfPosition:pos + axisForward:self.forward + axisRight:self.right + axisUp:self.up]; } - (AgoraRemoteVoicePositionInfo *)getPlayerPostion: (UIView *)view { - NSArray *postion = [self getViewCenterPostion:view]; + simd_float3 postion = [self getViewCenterPostion:view]; AgoraRemoteVoicePositionInfo *postionInfo = [[AgoraRemoteVoicePositionInfo alloc] init]; postionInfo.position = postion; postionInfo.forward = self.forward; return postionInfo; } -- (NSArray *)getViewCenterPostion: (UIView *)view { - return @[@(view.center.x), @(view.center.y), @(0.0)]; +- (simd_float3)getViewCenterPostion: (UIView *)view { + return simd_make_float3(view.center.x, view.center.y, 0.0); } /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: 
https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; @@ -289,10 +288,12 @@ - (void)rtcEngine:(AgoraRtcEngineKit *)engine didJoinedOfUid:(NSUInteger)uid ela [self.remoteUserButton1 setTitle:[NSString stringWithFormat:@"%lu",uid] forState:(UIControlStateNormal)]; self.remoteUserButton1.tag = uid; [self.remoteUserButton1 setHidden:NO]; + [self.localSpatial updateRemotePosition:uid positionInfo:[self getPlayerPostion:self.remoteUserButton1]]; } else if (self.remoteUserButton2.tag <= 0) { [self.remoteUserButton2 setTitle:[NSString stringWithFormat:@"%lu",uid] forState:(UIControlStateNormal)]; self.remoteUserButton2.tag = uid; [self.remoteUserButton2 setHidden:NO]; + [self.localSpatial updateRemotePosition:uid positionInfo:[self getPlayerPostion:self.remoteUserButton2]]; } } @@ -312,9 +313,10 @@ - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOfflineOfUid:(NSUInteger)uid re [self.remoteUserButton2 setHidden:YES]; self.remoteUserButton2.tag = 0; } + [self.localSpatial removeRemotePosition:uid]; } -- (void)AgoraRtcMediaPlayer:(id)playerKit didChangedToState:(AgoraMediaPlayerState)state error:(AgoraMediaPlayerError)error { +- (void)AgoraRtcMediaPlayer:(id)playerKit didChangedToState:(AgoraMediaPlayerState)state reason:(AgoraMediaPlayerReason)reason { if (state == AgoraMediaPlayerStateOpenCompleted || state == AgoraMediaPlayerStatePlayBackAllLoopsCompleted || state == AgoraMediaPlayerStatePlayBackCompleted) { [playerKit play]; } diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/StreamEncryption/StreamEncryption.m 
b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/StreamEncryption/StreamEncryption.m index ab1258e76..6da8c8545 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/StreamEncryption/StreamEncryption.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/StreamEncryption/StreamEncryption.m @@ -178,8 +178,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -202,8 +202,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/VideoMetadata/VideoMetadata.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/VideoMetadata/VideoMetadata.m index 091c05c2c..fda2c9f86 100644 --- 
a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/VideoMetadata/VideoMetadata.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/VideoMetadata/VideoMetadata.m @@ -124,8 +124,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -147,8 +147,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/VideoProcess/VideoProcess.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/VideoProcess/VideoProcess.m index b922d6682..74ed9bd87 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/VideoProcess/VideoProcess.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/VideoProcess/VideoProcess.m @@ -146,8 +146,8 @@ - (void)viewDidLoad { if (result 
!= 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -259,8 +259,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/VoiceChanger/VoiceChanger.m b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/VoiceChanger/VoiceChanger.m index a707b42a9..a528e5cfa 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/VoiceChanger/VoiceChanger.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Advanced/VoiceChanger/VoiceChanger.m @@ -502,8 +502,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: 
https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -521,8 +521,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelAudio/JoinChannelAudio.m b/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelAudio/JoinChannelAudio.m index ffedc5087..d4822310f 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelAudio/JoinChannelAudio.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelAudio/JoinChannelAudio.m @@ -109,8 +109,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: 
https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -126,8 +126,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelVideo(Recorder)/JoinChannelVideoRecorder.m b/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelVideo(Recorder)/JoinChannelVideoRecorder.m index f0fb71aae..3ec4f7f82 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelVideo(Recorder)/JoinChannelVideoRecorder.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelVideo(Recorder)/JoinChannelVideoRecorder.m @@ -203,8 +203,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; 
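The SpatialAudio hunks above migrate position vectors from boxed `NSArray` values to `simd_float3`, which the 4.x spatial audio API expects. A minimal sketch of the pattern, assuming the same `localSpatial`/`forward`/`right`/`up` properties as the diff (helper names here are illustrative):

```objectivec
#import <simd/simd.h>

// Build a simd_float3 position from a view's center (z fixed at 0),
// mirroring getViewCenterPostion: in the diff above.
static simd_float3 PositionForView(UIView *view) {
    return simd_make_float3(view.center.x, view.center.y, 0.0);
}

// Feed the position to the local spatial audio engine; forward/right/up
// are unit axes, e.g. simd_make_float3(1.0, 0.0, 0.0) for forward.
- (void)updateSelfPositionForView:(UIView *)view {
    [self.localSpatial updateSelfPosition:PositionForView(view)
                              axisForward:self.forward
                                axisRight:self.right
                                   axisUp:self.up];
}
```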
@@ -257,7 +257,12 @@ - (void)mediaRecorder:(AgoraMediaRecorder *)recorder informationDidUpdated:(NSSt NSLog(@"uid == %lu info == %@", uid, info.description); } -- (void)mediaRecorder:(AgoraMediaRecorder *)recorder stateDidChanged:(NSString *)channelId uid:(NSUInteger)uid state:(AgoraMediaRecorderState)state error:(AgoraMediaRecorderErrorCode)error { +- (void)mediaRecorder:(AgoraMediaRecorder * _Nonnull)recorder stateDidChanged:(NSString * _Nonnull)channelId uid:(NSUInteger)uid state:(AgoraMediaRecorderState)state reason:(AgoraMediaRecorderReasonCode)reason { + [LogUtil log: [NSString stringWithFormat:@"stateDidChanged uid == %lu state == %ld reason == %ld", uid, state, reason]]; +} + + +- (void)mediaRecorder:(AgoraMediaRecorder *)recorder stateDidChanged:(NSString *)channelId uid:(NSUInteger)uid state:(AgoraMediaRecorderState)state { NSLog(@"uid == %lu state == %ld", uid, state); } @@ -265,8 +270,8 @@ - (void)mediaRecorder:(AgoraMediaRecorder *)recorder stateDidChanged:(NSString * /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { NSLog(@"Error %ld occur",errorCode); diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelVideo(Token)/JoinChannelVideoToken.m b/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelVideo(Token)/JoinChannelVideoToken.m index 1ef81c056..597ebacf2 100644 --- 
a/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelVideo(Token)/JoinChannelVideoToken.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelVideo(Token)/JoinChannelVideoToken.m @@ -147,8 +147,8 @@ - (void)viewDidLoad { if (result != 0) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } } @@ -165,8 +165,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { NSLog(@"Error %ld occur",errorCode); diff --git a/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelVideo/JoinChannelVideo.m b/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelVideo/JoinChannelVideo.m index b2f73506f..f3fb97b42 100644 --- a/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelVideo/JoinChannelVideo.m +++ b/iOS/APIExample-OC/APIExample-OC/Examples/Basic/JoinChannelVideo/JoinChannelVideo.m @@ -119,8 +119,8 @@ - (void)viewDidLoad { if (result != 0) { // 
Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code NSLog(@"joinChannel call failed: %d, please check your params", result); } }]; @@ -138,8 +138,8 @@ - (void)viewDidDisappear:(BOOL)animated { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: -/// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content -/// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html +/// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode +/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem - (void)rtcEngine:(AgoraRtcEngineKit *)engine didOccurError:(AgoraErrorCode)errorCode { [LogUtil log:[NSString stringWithFormat:@"Error %ld occur",errorCode] level:(LogLevelError)]; diff --git a/iOS/APIExample-OC/Podfile b/iOS/APIExample-OC/Podfile index c7120307e..a5ba13adf 100644 --- a/iOS/APIExample-OC/Podfile +++ b/iOS/APIExample-OC/Podfile @@ -5,20 +5,20 @@ target 'APIExample-OC' do # Comment the next line if you don't want to use dynamic frameworks use_frameworks! - pod 'AgoraRtcEngine_iOS', '4.2.6' - # pod 'sdk', :path => 'sdk.podspec' + pod 'AgoraRtcEngine_iOS', '4.3.0' + # pod 'sdk', :path => 'sdk.podspec' end target 'Agora-ScreenShare-Extension-OC' do use_frameworks! 
- # pod 'sdk', :path => 'sdk.podspec' -pod 'AgoraRtcEngine_iOS', '4.2.6' + # pod 'sdk', :path => 'sdk.podspec' + pod 'AgoraRtcEngine_iOS', '4.3.0' end target 'SimpleFilter' do use_frameworks! - # pod 'sdk', :path => 'sdk.podspec' - pod 'AgoraRtcEngine_iOS', '4.2.6' + # pod 'sdk', :path => 'sdk.podspec' + pod 'AgoraRtcEngine_iOS', '4.3.0' end diff --git a/iOS/APIExample/.swiftlint.yml b/iOS/APIExample/.swiftlint.yml new file mode 100644 index 000000000..a043fff25 --- /dev/null +++ b/iOS/APIExample/.swiftlint.yml @@ -0,0 +1,40 @@ +disabled_rules: # rules to disable + - todo # allow TODO comments + - trailing_whitespace # allow trailing whitespace at line ends + - unneeded_override # allow overrides that only call super + - identifier_name # no variable/parameter naming rules + - class_delegate_protocol # delegate protocols should be class-only so they can be weakly referenced + - type_body_length # no limit on type body line count + - cyclomatic_complexity # no limit on function complexity (number of branch statements) + +opt_in_rules: # rules to enable + - empty_count # prefer isEmpty over comparing count to zero + - missing_docs # flag missing documentation + - closure_end_indentation # closure end indentation + - empty_parentheses_with_trailing_closure # empty parentheses with a trailing closure + - duplicate_imports # duplicate imports + - force_unwrapping # force unwrapping + - nesting # nesting + - operator_whitespace # whitespace around operator functions + - switch_case_alignment # switch and case statement alignment + +excluded: # paths ignored while linting; takes precedence over `included` + - Carthage + - Pods + +# rules that have both warning and error levels, can set just the warning level +line_length: 145 # lines longer than 145 characters produce a warning + +large_tuple: 4 + +# function body length: warning over 60 lines, error over 100 +function_body_length: + warning: 60 + error: 100 + +file_length: # files over 800 lines produce a warning, over 1200 an error + warning: 800 + error: 1200 + +force_unwrapping: + severity: error \ No newline at end of file diff --git a/iOS/APIExample/APIExample.xcodeproj/project.pbxproj b/iOS/APIExample/APIExample.xcodeproj/project.pbxproj index 5a5b4d40f..c37bc7005 100644 --- a/iOS/APIExample/APIExample.xcodeproj/project.pbxproj +++ b/iOS/APIExample/APIExample.xcodeproj/project.pbxproj @@ -3,7 +3,7 @@ archiveVersion = 1;
classes = { }; - objectVersion = 51; + objectVersion = 54; objects = { /* Begin PBXBuildFile section */ @@ -175,6 +175,9 @@ E77D54C728F55E9100D51C1E /* JoinChannelVideoToken.strings in Resources */ = {isa = PBXBuildFile; fileRef = E77D54C228F55E9100D51C1E /* JoinChannelVideoToken.strings */; }; E77D54C828F55E9100D51C1E /* JoinChannelVideoToken.storyboard in Resources */ = {isa = PBXBuildFile; fileRef = E77D54C428F55E9100D51C1E /* JoinChannelVideoToken.storyboard */; }; E77D54C928F55E9100D51C1E /* JoinChannelVideoToken.swift in Sources */ = {isa = PBXBuildFile; fileRef = E77D54C628F55E9100D51C1E /* JoinChannelVideoToken.swift */; }; + E7883AE92B074746003CCF44 /* FaceCapture.strings in Resources */ = {isa = PBXBuildFile; fileRef = E7883AE42B074746003CCF44 /* FaceCapture.strings */; }; + E7883AEA2B074746003CCF44 /* FaceCapture.storyboard in Resources */ = {isa = PBXBuildFile; fileRef = E7883AE62B074746003CCF44 /* FaceCapture.storyboard */; }; + E7883AEB2B074746003CCF44 /* FaceCapture.swift in Sources */ = {isa = PBXBuildFile; fileRef = E7883AE82B074746003CCF44 /* FaceCapture.swift */; }; E7899BDC2861673600851463 /* CreateDataStream.strings in Resources */ = {isa = PBXBuildFile; fileRef = E7899BD72861673600851463 /* CreateDataStream.strings */; }; E7899BDD2861673600851463 /* CreateDataStream.storyboard in Resources */ = {isa = PBXBuildFile; fileRef = E7899BD92861673600851463 /* CreateDataStream.storyboard */; }; E7899BDE2861673600851463 /* CreateDataStream.swift in Sources */ = {isa = PBXBuildFile; fileRef = E7899BDB2861673600851463 /* CreateDataStream.swift */; }; @@ -491,6 +494,9 @@ E77D54C328F55E9100D51C1E /* zh-Hans */ = {isa = PBXFileReference; lastKnownFileType = text.plist.strings; name = "zh-Hans"; path = "zh-Hans.lproj/JoinChannelVideoToken.strings"; sourceTree = ""; }; E77D54C528F55E9100D51C1E /* Base */ = {isa = PBXFileReference; lastKnownFileType = file.storyboard; name = Base; path = Base.lproj/JoinChannelVideoToken.storyboard; sourceTree = ""; }; 
E77D54C628F55E9100D51C1E /* JoinChannelVideoToken.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = JoinChannelVideoToken.swift; sourceTree = ""; }; + E7883AE52B074746003CCF44 /* zh-Hans */ = {isa = PBXFileReference; lastKnownFileType = text.plist.strings; name = "zh-Hans"; path = "zh-Hans.lproj/FaceCapture.strings"; sourceTree = ""; }; + E7883AE72B074746003CCF44 /* Base */ = {isa = PBXFileReference; lastKnownFileType = file.storyboard; name = Base; path = Base.lproj/FaceCapture.storyboard; sourceTree = ""; }; + E7883AE82B074746003CCF44 /* FaceCapture.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = FaceCapture.swift; sourceTree = ""; }; E7899BD82861673600851463 /* zh-Hans */ = {isa = PBXFileReference; lastKnownFileType = text.plist.strings; name = "zh-Hans"; path = "zh-Hans.lproj/CreateDataStream.strings"; sourceTree = ""; }; E7899BDA2861673600851463 /* Base */ = {isa = PBXFileReference; lastKnownFileType = file.storyboard; name = Base; path = Base.lproj/CreateDataStream.storyboard; sourceTree = ""; }; E7899BDB2861673600851463 /* CreateDataStream.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = CreateDataStream.swift; sourceTree = ""; }; @@ -1096,6 +1102,7 @@ A75A56D724A0603000D0089E /* Advanced */ = { isa = PBXGroup; children = ( + E7883AE32B074746003CCF44 /* FaceCapture */, E726BFFD2A96FD3A006870E2 /* AudioWaveform */, E726BFF42A949F70006870E2 /* AuidoRouterPlayer */, E7163F7B2964149800EBBD55 /* ARKit */, @@ -1293,6 +1300,16 @@ path = "JoinChannelVideo(Token)"; sourceTree = ""; }; + E7883AE32B074746003CCF44 /* FaceCapture */ = { + isa = PBXGroup; + children = ( + E7883AE42B074746003CCF44 /* FaceCapture.strings */, + E7883AE62B074746003CCF44 /* FaceCapture.storyboard */, + E7883AE82B074746003CCF44 /* FaceCapture.swift */, + ); + path = FaceCapture; + sourceTree = ""; + }; E7899BD62861673600851463 /* CreateDataStream 
*/ = { isa = PBXGroup; children = ( @@ -1501,6 +1518,7 @@ 03D13BCA2448758900B599B3 /* Resources */, 1B6F6CF9B678035E221EAFDE /* [CP] Embed Pods Frameworks */, 0339BEBA25205B80007D4FDD /* Embed App Extensions */, + E76F80122AF0A7A200CCB9D6 /* ShellScript */, ); buildRules = ( ); @@ -1613,6 +1631,7 @@ E7A49CB829011E7500F06DD4 /* MutliCamera.strings in Resources */, E7A49CFA29029E0000F06DD4 /* ThirdBeautify.storyboard in Resources */, E7A49D0929067F8300F06DD4 /* SenseBeautify.strings in Resources */, + E7883AE92B074746003CCF44 /* FaceCapture.strings in Resources */, 033A9F7A252D8B5000BC26E1 /* MediaPlayer.storyboard in Resources */, 8BE7ABC3279E065000DFBCEF /* FusionCDN.storyboard in Resources */, 0339D6D224E91B80008739CD /* QuickSwitchChannelVCItem.xib in Resources */, @@ -1633,6 +1652,7 @@ 03414B5525546DEC00AB114D /* frames0.yuv in Resources */, E77D54C728F55E9100D51C1E /* JoinChannelVideoToken.strings in Resources */, 8B5E5B53274CBF760040E97D /* VideoProcess.storyboard in Resources */, + E7883AEA2B074746003CCF44 /* FaceCapture.storyboard in Resources */, 033A9F66252D8B2A00BC26E1 /* VoiceChanger.storyboard in Resources */, E7A49CB929011E7500F06DD4 /* MutliCamera.storyboard in Resources */, 033A9F52252D89E600BC26E1 /* CustomVideoRender.storyboard in Resources */, @@ -1760,6 +1780,23 @@ shellScript = "diff \"${PODS_PODFILE_DIR_PATH}/Podfile.lock\" \"${PODS_ROOT}/Manifest.lock\" > /dev/null\nif [ $? != 0 ] ; then\n # print error to STDERR\n echo \"error: The sandbox is not in sync with the Podfile.lock. 
Run 'pod install' or update your CocoaPods installation.\" >&2\n exit 1\nfi\n# This output is used by Xcode 'outputs' to avoid re-running this script phase.\necho \"SUCCESS\" > \"${SCRIPT_OUTPUT_FILE_0}\"\n"; showEnvVarsInLog = 0; }; + E76F80122AF0A7A200CCB9D6 /* ShellScript */ = { + isa = PBXShellScriptBuildPhase; + buildActionMask = 2147483647; + files = ( + ); + inputFileListPaths = ( + ); + inputPaths = ( + ); + outputFileListPaths = ( + ); + outputPaths = ( + ); + runOnlyForDeploymentPostprocessing = 0; + shellPath = /bin/sh; + shellScript = "\"${PODS_ROOT}/SwiftLint/swiftlint\" \n"; + }; /* End PBXShellScriptBuildPhase section */ /* Begin PBXSourcesBuildPhase section */ @@ -1857,6 +1894,7 @@ 67450169282D5D8B00E79F2F /* ContentInspect.swift in Sources */, 67B8C7B52805757200195106 /* MediaUtils.m in Sources */, 03D13C0124488F1F00B599B3 /* KeyCenter.swift in Sources */, + E7883AEB2B074746003CCF44 /* FaceCapture.swift in Sources */, 03BEED08251C35E7005E78F4 /* AudioMixing.swift in Sources */, E74877B728A23B8B00CA2F58 /* NetworkManager.swift in Sources */, 03DF1D9424CFC29700DF7151 /* AudioController.m in Sources */, @@ -2341,6 +2379,22 @@ name = JoinChannelVideoToken.storyboard; sourceTree = ""; }; + E7883AE42B074746003CCF44 /* FaceCapture.strings */ = { + isa = PBXVariantGroup; + children = ( + E7883AE52B074746003CCF44 /* zh-Hans */, + ); + name = FaceCapture.strings; + sourceTree = ""; + }; + E7883AE62B074746003CCF44 /* FaceCapture.storyboard */ = { + isa = PBXVariantGroup; + children = ( + E7883AE72B074746003CCF44 /* Base */, + ); + name = FaceCapture.storyboard; + sourceTree = ""; + }; E7899BD72861673600851463 /* CreateDataStream.strings */ = { isa = PBXVariantGroup; children = ( @@ -2533,6 +2587,7 @@ CLANG_WARN_OBJC_IMPLICIT_RETAIN_SELF = YES; CLANG_WARN_OBJC_LITERAL_CONVERSION = YES; CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR; + CLANG_WARN_QUOTED_INCLUDE_IN_FRAMEWORK_HEADER = NO; CLANG_WARN_RANGE_LOOP_ANALYSIS = YES; CLANG_WARN_STRICT_PROTOTYPES = YES; 
CLANG_WARN_SUSPICIOUS_MOVE = YES; @@ -2594,6 +2649,7 @@ CLANG_WARN_OBJC_IMPLICIT_RETAIN_SELF = YES; CLANG_WARN_OBJC_LITERAL_CONVERSION = YES; CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR; + CLANG_WARN_QUOTED_INCLUDE_IN_FRAMEWORK_HEADER = NO; CLANG_WARN_RANGE_LOOP_ANALYSIS = YES; CLANG_WARN_STRICT_PROTOTYPES = YES; CLANG_WARN_SUSPICIOUS_MOVE = YES; diff --git a/iOS/APIExample/APIExample/AppDelegate.swift b/iOS/APIExample/APIExample/AppDelegate.swift index b30a136d5..0291a7f04 100644 --- a/iOS/APIExample/APIExample/AppDelegate.swift +++ b/iOS/APIExample/APIExample/AppDelegate.swift @@ -17,7 +17,4 @@ class AppDelegate: UIResponder, UIApplicationDelegate { // Override point for customization after application launch. return true } - - } - diff --git a/iOS/APIExample/APIExample/Common/ARKit/ARVideoRenderer.swift b/iOS/APIExample/APIExample/Common/ARKit/ARVideoRenderer.swift index ae6781de8..35fc82ae8 100755 --- a/iOS/APIExample/APIExample/Common/ARKit/ARVideoRenderer.swift +++ b/iOS/APIExample/APIExample/Common/ARKit/ARVideoRenderer.swift @@ -11,7 +11,7 @@ import MetalKit import SceneKit import AgoraRtcKit -class ARVideoRenderer : NSObject { +class ARVideoRenderer: NSObject { fileprivate var yTexture: MTLTexture? fileprivate var uTexture: MTLTexture? fileprivate var vTexture: MTLTexture? 
diff --git a/iOS/APIExample/APIExample/Common/AgoraExtension.swift b/iOS/APIExample/APIExample/Common/AgoraExtension.swift index 26b5905db..c477ec44d 100644 --- a/iOS/APIExample/APIExample/Common/AgoraExtension.swift +++ b/iOS/APIExample/APIExample/Common/AgoraExtension.swift @@ -119,51 +119,22 @@ extension AgoraEncryptionMode { } } -//extension AgoraAudioVoiceChanger { -// func description() -> String { -// switch self { -// case .voiceChangerOff:return "Off".localized -// case .generalBeautyVoiceFemaleFresh:return "FemaleFresh".localized -// case .generalBeautyVoiceFemaleVitality:return "FemaleVitality".localized -// case .generalBeautyVoiceMaleMagnetic:return "MaleMagnetic".localized -// case .voiceBeautyVigorous:return "Vigorous".localized -// case .voiceBeautyDeep:return "Deep".localized -// case .voiceBeautyMellow:return "Mellow".localized -// case .voiceBeautyFalsetto:return "Falsetto".localized -// case .voiceBeautyFull:return "Full".localized -// case .voiceBeautyClear:return "Clear".localized -// case .voiceBeautyResounding:return "Resounding".localized -// case .voiceBeautyRinging:return "Ringing".localized -// case .voiceBeautySpacial:return "Spacial".localized -// case .voiceChangerEthereal:return "Ethereal".localized -// case .voiceChangerOldMan:return "Old Man".localized -// case .voiceChangerBabyBoy:return "Baby Boy".localized -// case .voiceChangerBabyGirl:return "Baby Girl".localized -// case .voiceChangerZhuBaJie:return "ZhuBaJie".localized -// case .voiceChangerHulk:return "Hulk".localized -// default: -// return "\(self.rawValue)" -// } -// } -//} - -extension AgoraVoiceBeautifierPreset{ +extension AgoraVoiceBeautifierPreset { func description() -> String { switch self { - case .presetOff:return "Off".localized - case .presetChatBeautifierFresh:return "FemaleFresh".localized - case .presetChatBeautifierMagnetic:return "MaleMagnetic".localized - case .presetChatBeautifierVitality:return "FemaleVitality".localized - case 
.timbreTransformationVigorous:return "Vigorous".localized - case .timbreTransformationDeep:return "Deep".localized - case .timbreTransformationMellow:return "Mellow".localized - case .timbreTransformationFalsetto:return "Falsetto".localized - case .timbreTransformationFull:return "Full".localized - case .timbreTransformationClear:return "Clear".localized - case .timbreTransformationResounding:return "Resounding".localized - case .timbreTransformatRinging:return "Ringing".localized - default: - return "\(self.rawValue)" + case .presetOff: return "Off".localized + case .presetChatBeautifierFresh: return "FemaleFresh".localized + case .presetChatBeautifierMagnetic: return "MaleMagnetic".localized + case .presetChatBeautifierVitality: return "FemaleVitality".localized + case .timbreTransformationVigorous: return "Vigorous".localized + case .timbreTransformationDeep: return "Deep".localized + case .timbreTransformationMellow: return "Mellow".localized + case .timbreTransformationFalsetto: return "Falsetto".localized + case .timbreTransformationFull: return "Full".localized + case .timbreTransformationClear: return "Clear".localized + case .timbreTransformationResounding: return "Resounding".localized + case .timbreTransformatRinging: return "Ringing".localized + default: return "\(self.rawValue)" } } } @@ -171,17 +142,16 @@ extension AgoraVoiceBeautifierPreset{ extension AgoraAudioReverbPreset { func description() -> String { switch self { - case .off:return "Off".localized - case .fxUncle:return "FxUncle".localized - case .fxSister:return "FxSister".localized - case .fxPopular:return "Pop".localized - case .fxRNB:return "R&B".localized - case .fxVocalConcert:return "Vocal Concert".localized - case .fxKTV:return "KTV".localized - case .fxStudio:return "Studio".localized - case .fxPhonograph:return "Phonograph".localized - default: - return "\(self.rawValue)" + case .off: return "Off".localized + case .fxUncle: return "FxUncle".localized + case .fxSister: return 
"FxSister".localized + case .fxPopular: return "Pop".localized + case .fxRNB: return "R&B".localized + case .fxVocalConcert: return "Vocal Concert".localized + case .fxKTV: return "KTV".localized + case .fxStudio: return "Studio".localized + case .fxPhonograph: return "Phonograph".localized + default: return "\(self.rawValue)" } } } @@ -189,27 +159,26 @@ extension AgoraAudioReverbPreset { extension AgoraAudioEffectPreset { func description() -> String { switch self { - case .off:return "Off".localized - case .voiceChangerEffectUncle:return "FxUncle".localized - case .voiceChangerEffectOldMan:return "Old Man".localized - case .voiceChangerEffectBoy:return "Baby Boy".localized - case .voiceChangerEffectSister:return "FxSister".localized - case .voiceChangerEffectGirl:return "Baby Girl".localized - case .voiceChangerEffectPigKin:return "ZhuBaJie".localized - case .voiceChangerEffectHulk:return "Hulk".localized - case .styleTransformationRnb:return "R&B".localized - case .styleTransformationPopular:return "Pop".localized - case .roomAcousticsKTV:return "KTV".localized - case .roomAcousVocalConcer:return "Vocal Concert".localized - case .roomAcousStudio:return "Studio".localized - case .roomAcousPhonograph:return "Phonograph".localized - case .roomAcousVirtualStereo:return "Virtual Stereo".localized - case .roomAcousSpatial:return "Spacial".localized - case .roomAcousEthereal:return "Ethereal".localized - case .roomAcous3DVoice:return "3D Voice".localized - case .pitchCorrection:return "Pitch Correction".localized - default: - return "\(self.rawValue)" + case .off: return "Off".localized + case .voiceChangerEffectUncle: return "FxUncle".localized + case .voiceChangerEffectOldMan: return "Old Man".localized + case .voiceChangerEffectBoy: return "Baby Boy".localized + case .voiceChangerEffectSister: return "FxSister".localized + case .voiceChangerEffectGirl: return "Baby Girl".localized + case .voiceChangerEffectPigKin: return "ZhuBaJie".localized + case 
.voiceChangerEffectHulk: return "Hulk".localized + case .styleTransformationRnb: return "R&B".localized + case .styleTransformationPopular: return "Pop".localized + case .roomAcousticsKTV: return "KTV".localized + case .roomAcousVocalConcer: return "Vocal Concert".localized + case .roomAcousStudio: return "Studio".localized + case .roomAcousPhonograph: return "Phonograph".localized + case .roomAcousVirtualStereo: return "Virtual Stereo".localized + case .roomAcousSpatial: return "Spacial".localized + case .roomAcousEthereal: return "Ethereal".localized + case .roomAcous3DVoice: return "3D Voice".localized + case .pitchCorrection: return "Pitch Correction".localized + default: return "\(self.rawValue)" } } } @@ -217,18 +186,17 @@ extension AgoraAudioEffectPreset { extension AgoraAudioEqualizationBandFrequency { func description() -> String { switch self { - case .band31: return "31Hz" - case .band62: return "62Hz" - case .band125: return "125Hz" - case .band250: return "250Hz" - case .band500: return "500Hz" - case .band1K: return "1kHz" - case .band2K: return "2kHz" - case .band4K: return "4kHz" - case .band8K: return "8kHz" - case .band16K: return "16kHz" - @unknown default: - return "\(self.rawValue)" + case .band31: return "31Hz" + case .band62: return "62Hz" + case .band125: return "125Hz" + case .band250: return "250Hz" + case .band500: return "500Hz" + case .band1K: return "1kHz" + case .band2K: return "2kHz" + case .band4K: return "4kHz" + case .band8K: return "8kHz" + case .band16K: return "16kHz" + default: return "\(self.rawValue)" } } } @@ -236,13 +204,12 @@ extension AgoraAudioEqualizationBandFrequency { extension AgoraAudioReverbType { func description() -> String { switch self { - case .dryLevel: return "Dry Level".localized - case .wetLevel: return "Wet Level".localized - case .roomSize: return "Room Size".localized - case .wetDelay: return "Wet Delay".localized - case .strength: return "Strength".localized - @unknown default: - return 
"\(self.rawValue)" + case .dryLevel: return "Dry Level".localized + case .wetLevel: return "Wet Level".localized + case .roomSize: return "Room Size".localized + case .wetDelay: return "Wet Delay".localized + case .strength: return "Strength".localized + @unknown default: return "\(self.rawValue)" } } } @@ -250,11 +217,10 @@ extension AgoraAudioReverbType { extension AUDIO_AINS_MODE { func description() -> String { switch self { - case .AINS_MODE_AGGRESSIVE: return "AGGRESSIVE".localized - case .AINS_MODE_BALANCED: return "BALANCED".localized - case .AINS_MODE_ULTRALOWLATENCY: return "ULTRALOWLATENCY".localized - @unknown default: - return "\(self.rawValue)" + case .AINS_MODE_AGGRESSIVE: return "AGGRESSIVE".localized + case .AINS_MODE_BALANCED: return "BALANCED".localized + case .AINS_MODE_ULTRALOWLATENCY: return "ULTRALOWLATENCY".localized + @unknown default: return "\(self.rawValue)" } } } @@ -262,18 +228,12 @@ extension AUDIO_AINS_MODE { extension AgoraVoiceConversionPreset { func description() -> String { switch self { - case .off: - return "Off".localized - case .neutral: - return "Neutral".localized - case .sweet: - return "Sweet".localized - case .changerSolid: - return "Solid".localized - case .changerBass: - return "Bass".localized - @unknown default: - return "\(self.rawValue)" + case .off: return "Off".localized + case .neutral: return "Neutral".localized + case .sweet: return "Sweet".localized + case .changerSolid: return "Solid".localized + case .changerBass: return "Bass".localized + default: return "\(self.rawValue)" } } } @@ -315,18 +275,14 @@ extension OutputStream { /// - parameter allowLossyConversion: Whether to permit lossy conversion when writing the string. Defaults to `false`. /// /// - returns: Return total number of bytes written upon success. Return `-1` upon failure. 
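The doc comment above promises the total byte count on success and `-1` on failure. A sketch honoring that contract without letting the buffer pointer escape `withUnsafeBytes` (Foundation only; the method is renamed `writeString` here to avoid clashing with the extension in the patch):

```swift
import Foundation

extension OutputStream {
    /// Writes `string` with the given encoding.
    /// Returns the number of bytes written, or -1 on failure.
    func writeString(_ string: String,
                     encoding: String.Encoding = .utf8,
                     allowLossyConversion: Bool = false) -> Int {
        guard let data = string.data(using: encoding,
                                     allowLossyConversion: allowLossyConversion) else {
            return -1
        }
        // The pointer is only used inside the closure, so it never escapes.
        return data.withUnsafeBytes { raw -> Int in
            guard let base = raw.bindMemory(to: UInt8.self).baseAddress else {
                return -1
            }
            return write(base, maxLength: raw.count)
        }
    }
}
```

Capturing the `UnsafeRawBufferPointer` outside the closure, as `data.withUnsafeBytes({ $0 })` does, is undefined behavior; keeping the write inside the closure sidesteps that.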
- func write(_ string: String, encoding: String.Encoding = .utf8, allowLossyConversion: Bool = false) -> Int { if let data = string.data(using: encoding, allowLossyConversion: allowLossyConversion) { - let ret = data.withUnsafeBytes { - write($0, maxLength: data.count) - } - if(ret < 0) { - print("write fail: \(streamError.debugDescription)") + data.withUnsafeBytes { raw in + if let address = raw.bindMemory(to: UInt8.self).baseAddress { + _ = write(address, maxLength: raw.count) + } } } - return -1 } @@ -339,4 +295,3 @@ extension Date { return dateformat.string(from: self) } } - diff --git a/iOS/APIExample/APIExample/Common/AlertManager.swift b/iOS/APIExample/APIExample/Common/AlertManager.swift index 9cc18c11c..6a2cd0499 100644 --- a/iOS/APIExample/APIExample/Common/AlertManager.swift +++ b/iOS/APIExample/APIExample/Common/AlertManager.swift @@ -8,8 +8,8 @@ import UIKit import AVFoundation -public let cl_screenWidht = UIScreen.main.bounds.width -public let cl_screenHeight = UIScreen.main.bounds.height +let cl_screenWidht = UIScreen.main.bounds.width +let cl_screenHeight = UIScreen.main.bounds.height class AlertManager: NSObject { private struct AlertViewCache { var view: UIView? @@ -19,7 +19,6 @@ class AlertManager: NSObject { case center case bottom } - private static var vc: UIViewController? private static var containerView: UIView? private static var currentPosition: AlertPosition = .center @@ -37,7 +36,9 @@ class AlertManager: NSObject { containerView?.backgroundColor = UIColor(red: 0.0/255, green: 0.0/255, blue: 0.0/255, alpha: 0.0) } if didCoverDismiss { - (containerView as?
UIButton)?.addTarget(self, + action: #selector(tapView), + for: .touchUpInside) } guard let containerView = containerView else { return } containerView.addSubview(view) @@ -46,7 +47,7 @@ class AlertManager: NSObject { if alertPostion == .center { view.centerXAnchor.constraint(equalTo: containerView.centerXAnchor).isActive = true view.centerYAnchor.constraint(equalTo: containerView.centerYAnchor).isActive = true - }else{ + } else { bottomAnchor = view.bottomAnchor.constraint(equalTo: containerView.bottomAnchor) view.leadingAnchor.constraint(equalTo: containerView.leadingAnchor).isActive = true view.trailingAnchor.constraint(equalTo: containerView.trailingAnchor).isActive = true @@ -56,24 +57,31 @@ vc?.view.backgroundColor = UIColor.clear vc?.view.addSubview(containerView) vc?.modalPresentationStyle = .custom - UIViewController.cl_topViewController()?.present(vc!, animated: false) { + UIViewController.cl_topViewController()?.present(vc ?? UIViewController(), animated: false) { showAlertPostion(alertPostion: alertPostion, view: view) } } else { showAlertPostion(alertPostion: alertPostion, view: view) } - //register keyboard-show notification - NotificationCenter.default.addObserver(self, selector: #selector(keyboardWillShow(notification:)), name: UIApplication.keyboardWillShowNotification, object: nil) + // register keyboard-show notification + NotificationCenter.default.addObserver(self, + selector: #selector(keyboardWillShow(notification:)), + name: UIApplication.keyboardWillShowNotification, + object: nil) - //register keyboard-hide notification - NotificationCenter.default.addObserver(self, selector: #selector(keyboardWillHide(notification:)), name: UIApplication.keyboardWillHideNotification, object: nil) + // register keyboard-hide notification + NotificationCenter.default.addObserver(self, + selector: #selector(keyboardWillHide(notification:)), + name: UIApplication.keyboardWillHideNotification, + object: nil) } - private static func showAlertPostion(alertPostion: AlertPosition, view: UIView) { + private static func
showAlertPostion(alertPostion: AlertPosition, + view: UIView) { containerView?.layoutIfNeeded() if alertPostion == .center { showCenterView(view: view) - }else{ + } else { bottomAnchor?.constant = view.frame.height bottomAnchor?.isActive = true containerView?.layoutIfNeeded() @@ -81,7 +89,7 @@ } } - private static func showCenterView(view: UIView){ + private static func showCenterView(view: UIView) { if !viewCache.isEmpty { viewCache.forEach({ $0.view?.alpha = 0 }) } @@ -94,7 +102,7 @@ }) } - private static func showBottomView(view: UIView){ + private static func showBottomView(view: UIView) { if !viewCache.isEmpty { viewCache.forEach({ $0.view?.alpha = 0 }) } @@ -116,7 +124,8 @@ }) } - static func hiddenView(all: Bool = true, completion: (() -> Void)? = nil){ + static func hiddenView(all: Bool = true, + completion: (() -> Void)? = nil) { if currentPosition == .bottom { guard let lastView = viewCache.last?.view else { return } bottomAnchor?.constant = lastView.frame.height @@ -145,19 +154,21 @@ } @objc - private static func tapView(){ + private static func tapView() { DispatchQueue.main.asyncAfter(deadline: DispatchTime(uptimeNanoseconds: UInt64(0.1))) { self.hiddenView() } } - private static var originFrame:CGRect = .zero + private static var originFrame: CGRect = .zero // keyboard will show - @objc private static func keyboardWillShow(notification: Notification) { + @objc + private static func keyboardWillShow(notification: Notification) { let keyboardHeight = (notification.userInfo?["UIKeyboardBoundsUserInfoKey"] as? CGRect)?.height - let y = cl_screenHeight - (keyboardHeight ?? 304) - containerView!.frame.height + guard let viewHeight = containerView?.frame.height else { return } + let y = cl_screenHeight - (keyboardHeight ?? 304) - viewHeight if originFrame.origin.y != y { - originFrame = containerView!.frame + originFrame = containerView?.frame ?? 
.zero } UIView.animate(withDuration: 0.25) { containerView?.frame.origin.y = y @@ -174,4 +185,3 @@ class AlertManager: NSObject { } } } - diff --git a/iOS/APIExample/APIExample/Common/BaseViewController.swift b/iOS/APIExample/APIExample/Common/BaseViewController.swift index 0afbb7495..9167e6ad7 100644 --- a/iOS/APIExample/APIExample/Common/BaseViewController.swift +++ b/iOS/APIExample/APIExample/Common/BaseViewController.swift @@ -9,9 +9,8 @@ import UIKit import AGEVideoLayout - class BaseViewController: AGViewController { - var configs: [String:Any] = [:] + var configs: [String: Any] = [:] override func viewDidLoad() { // self.navigationItem.rightBarButtonItem = UIBarButtonItem(title: "Show Log", // style: .plain, @@ -20,7 +19,8 @@ class BaseViewController: AGViewController { LogUtils.removeAll() } - @objc func showLog() { + @objc + func showLog() { let storyBoard: UIStoryboard = UIStoryboard(name: "Main", bundle: nil) let newViewController = storyBoard.instantiateViewController(withIdentifier: "LogViewController") self.present(newViewController, animated: true, completion: nil) @@ -36,7 +36,7 @@ class BaseViewController: AGViewController { self.present(alertController, animated: true, completion: nil) } - func getAudioLabel(uid:UInt, isLocal:Bool) -> String { + func getAudioLabel(uid: UInt, isLocal: Bool) -> String { return "AUDIO ONLY\n\(isLocal ? 
"Local" : "Remote")\n\(uid)" } } @@ -44,9 +44,7 @@ class BaseViewController: AGViewController { extension AGEVideoContainer { func layoutStream(views: [AGView]) { let count = views.count - var layout: AGEVideoLayout - if count == 1 { layout = AGEVideoLayout(level: 0) .itemSize(.scale(CGSize(width: 1, height: 1))) @@ -60,7 +58,7 @@ extension AGEVideoContainer { return } - self.listCount { (level) -> Int in + self.listCount { _ -> Int in return views.count }.listItem { (index) -> AGEView in return views[index.item] @@ -74,14 +72,14 @@ extension AGEVideoContainer { var layout: AGEVideoLayout - if count > 2 { + if count > 2 { return } else { layout = AGEVideoLayout(level: 0) .itemSize(.scale(CGSize(width: 1, height: 0.5))) } - self.listCount { (level) -> Int in + self.listCount { _ -> Int in return views.count }.listItem { (index) -> AGEView in return views[index.item] @@ -95,14 +93,14 @@ extension AGEVideoContainer { var layout: AGEVideoLayout - if count > 2 { + if count > 2 { return } else { layout = AGEVideoLayout(level: 0) .itemSize(.scale(CGSize(width: 0.5, height: 1))) } - self.listCount { (level) -> Int in + self.listCount { _ -> Int in return views.count }.listItem { (index) -> AGEView in return views[index.item] @@ -116,19 +114,18 @@ extension AGEVideoContainer { var layout: AGEVideoLayout - if count > 4 { + if count > 4 { return } else { layout = AGEVideoLayout(level: 0) .itemSize(.scale(CGSize(width: 0.5, height: 0.5))) } - self.listCount { (level) -> Int in + self.listCount { _ -> Int in return views.count }.listItem { (index) -> AGEView in return views[index.item] } - self.setLayouts([layout]) } @@ -137,19 +134,18 @@ extension AGEVideoContainer { var layout: AGEVideoLayout - if count > 6 { + if count > 6 { return } else { layout = AGEVideoLayout(level: 0) .itemSize(.scale(CGSize(width: 0.5, height: 0.33))) } - self.listCount { (level) -> Int in + self.listCount { _ -> Int in return views.count }.listItem { (index) -> AGEView in return views[index.item] } - 
self.setLayouts([layout]) } @@ -158,19 +154,18 @@ extension AGEVideoContainer { var layout: AGEVideoLayout - if count > 6 { + if count > 6 { return } else { layout = AGEVideoLayout(level: 0) .itemSize(.scale(CGSize(width: 0.33, height: 0.5))) } - self.listCount { (level) -> Int in + self.listCount { _ -> Int in return views.count }.listItem { (index) -> AGEView in return views[index.item] } - self.setLayouts([layout]) } @@ -179,19 +174,18 @@ extension AGEVideoContainer { var layout: AGEVideoLayout - if count > 9 { + if count > 9 { return } else { layout = AGEVideoLayout(level: 0) .itemSize(.scale(CGSize(width: 0.33, height: 0.33))) } - self.listCount { (level) -> Int in + self.listCount { _ -> Int in return views.count }.listItem { (index) -> AGEView in return views[index.item] } - self.setLayouts([layout]) } } diff --git a/iOS/APIExample/APIExample/Common/EntryViewController.swift b/iOS/APIExample/APIExample/Common/EntryViewController.swift index 743156ad3..4e323317d 100644 --- a/iOS/APIExample/APIExample/Common/EntryViewController.swift +++ b/iOS/APIExample/APIExample/Common/EntryViewController.swift @@ -9,14 +9,13 @@ import Foundation import UIKit -class EntryViewController : UIViewController -{ +class EntryViewController: UIViewController { @IBOutlet weak var joinButton: AGButton! @IBOutlet weak var channelTextField: AGTextField! @IBOutlet weak var noteLabel: UILabel! 
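The `AGEVideoContainer` layout helpers above pick a fixed per-item scale for each view-count tier (1×1, 1×0.5, 0.5×0.5, and so on). That tier selection can be captured in one pure function; a sketch — the helper name is illustrative, not part of AGEVideoLayout:

```swift
// Mirrors the layout tiers used by the layoutStream* helpers above:
// returns the per-item (width, height) scale, or nil when the count
// exceeds what the 3x3 grid variant supports.
func gridItemScale(forViewCount count: Int) -> (width: Double, height: Double)? {
    switch count {
    case 1:      return (1.0, 1.0)
    case 2:      return (1.0, 0.5)
    case 3...4:  return (0.5, 0.5)
    case 5...6:  return (0.5, 0.33)
    case 7...9:  return (0.33, 0.33)
    default:     return nil
    }
}
```

Folding the tiers into one function would also remove the near-duplicate `listCount`/`listItem` bodies repeated in each helper.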
var note: String = "" - //identifer of next view controller after press join button + // identifier of next view controller after pressing the join button var nextVCIdentifier: String = "" override func viewDidLoad() { @@ -25,15 +24,17 @@ } @IBAction func doJoinPressed(sender: AGButton) { - guard let channelName = channelTextField.text else {return} - //resign channel text field + guard let channelName = channelTextField.text else { return } + // resign channel text field channelTextField.resignFirstResponder() let storyBoard: UIStoryboard = UIStoryboard(name: "Main", bundle: nil) // create new view controller every time to ensure we get a clean vc - guard let newViewController = storyBoard.instantiateViewController(withIdentifier: nextVCIdentifier) as? BaseViewController else {return} + guard let newViewController = storyBoard.instantiateViewController(withIdentifier: nextVCIdentifier) as? BaseViewController else { + return + } newViewController.title = channelName - newViewController.configs = ["channelName":channelName] - self.navigationController?.pushViewController(newViewController, animated: true) + newViewController.configs = ["channelName": channelName] + navigationController?.pushViewController(newViewController, animated: true) } } diff --git a/iOS/APIExample/APIExample/Common/ExternalAudio/AgoraPcmSourcePush.swift b/iOS/APIExample/APIExample/Common/ExternalAudio/AgoraPcmSourcePush.swift index 29b3017fd..ff0693df0 100644 --- a/iOS/APIExample/APIExample/Common/ExternalAudio/AgoraPcmSourcePush.swift +++ b/iOS/APIExample/APIExample/Common/ExternalAudio/AgoraPcmSourcePush.swift @@ -9,7 +9,7 @@ import Foundation protocol AgoraPcmSourcePushDelegate { - func onAudioFrame(data: UnsafeMutablePointer) -> Void + func onAudioFrame(data: UnsafeMutablePointer) } class AgoraPcmSourcePush: NSObject { diff --git a/iOS/APIExample/APIExample/Common/ExternalAudio/ExternalAudio.h
b/iOS/APIExample/APIExample/Common/ExternalAudio/ExternalAudio.h index 17e1cb3a1..8b1c20564 100644 --- a/iOS/APIExample/APIExample/Common/ExternalAudio/ExternalAudio.h +++ b/iOS/APIExample/APIExample/Common/ExternalAudio/ExternalAudio.h @@ -20,7 +20,12 @@ @property (nonatomic, weak) id delegate; + (instancetype)sharedExternalAudio; -- (void)setupExternalAudioWithAgoraKit:(AgoraRtcEngineKit *)agoraKit sampleRate:(uint)sampleRate channels:(uint)channels audioCRMode:(AudioCRMode)audioCRMode IOType:(IOUnitType)ioType; +- (void)setupExternalAudioWithAgoraKit:(AgoraRtcEngineKit *)agoraKit + sampleRate:(uint)sampleRate + channels:(uint)channels + trackId:(int)trackId + audioCRMode:(AudioCRMode)audioCRMode + IOType:(IOUnitType)ioType; - (void)startWork; - (void)stopWork; @end diff --git a/iOS/APIExample/APIExample/Common/ExternalAudio/ExternalAudio.mm b/iOS/APIExample/APIExample/Common/ExternalAudio/ExternalAudio.mm index dc628b7fb..4067c6590 100644 --- a/iOS/APIExample/APIExample/Common/ExternalAudio/ExternalAudio.mm +++ b/iOS/APIExample/APIExample/Common/ExternalAudio/ExternalAudio.mm @@ -26,6 +26,7 @@ @interface ExternalAudio () @property (nonatomic, assign) int sampleRate; @property (nonatomic, assign) int channelCount; @property (nonatomic, weak) AgoraRtcEngineKit *agoraKit; +@property (nonatomic, assign) int trackId; @end @implementation ExternalAudio @@ -245,7 +246,12 @@ + (instancetype)sharedExternalAudio { return audio; } -- (void)setupExternalAudioWithAgoraKit:(AgoraRtcEngineKit *)agoraKit sampleRate:(uint)sampleRate channels:(uint)channels audioCRMode:(AudioCRMode)audioCRMode IOType:(IOUnitType)ioType { +- (void)setupExternalAudioWithAgoraKit:(AgoraRtcEngineKit *)agoraKit + sampleRate:(uint)sampleRate + channels:(uint)channels + trackId:(int)trackId + audioCRMode:(AudioCRMode)audioCRMode + IOType:(IOUnitType)ioType { threadLockCapture = [[NSObject alloc] init]; threadLockPlay = [[NSObject alloc] init]; @@ -276,6 +282,8 @@ - 
(void)setupExternalAudioWithAgoraKit:(AgoraRtcEngineKit *)agoraKit sampleRate: self.agoraKit = agoraKit; self.audioCRMode = audioCRMode; + self.trackId = trackId; + self.sampleRate = sampleRate; } - (void)startWork { @@ -303,7 +311,10 @@ - (void)audioController:(AudioController *)controller didCaptureData:(unsigned c } else { // [self.agoraKit pushExternalAudioFrameNSData:[NSData dataWithBytes:data length:bytesLength] sourceId:1 timestamp:0]; - [self.agoraKit pushExternalAudioFrameRawData: data samples: 441 * 10 trackId:1 timestamp:0]; + [self.agoraKit pushExternalAudioFrameRawData:data + samples:self.sampleRate + trackId:self.trackId + timestamp:0]; } } diff --git a/iOS/APIExample/APIExample/Common/ExternalAudio/ZSNBoxingView.m b/iOS/APIExample/APIExample/Common/ExternalAudio/ZSNBoxingView.m index ac26be99a..c2e3ffa95 100644 --- a/iOS/APIExample/APIExample/Common/ExternalAudio/ZSNBoxingView.m +++ b/iOS/APIExample/APIExample/Common/ExternalAudio/ZSNBoxingView.m @@ -208,7 +208,7 @@ - (void)updateItems { - (void)start { if (self.displayLink == nil) { self.displayLink = [CADisplayLink displayLinkWithTarget:_itemLevelCallback selector:@selector(invoke)]; - self.displayLink.frameInterval = 6.f; + self.displayLink.preferredFramesPerSecond = 10; // frameInterval 6 at 60 Hz == 60 / 6 = 10 fps; the property is an NSInteger [self.displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSRunLoopCommonModes]; } } diff --git a/iOS/APIExample/APIExample/Common/ExternalVideo/AgoraCameraSourcePush.swift b/iOS/APIExample/APIExample/Common/ExternalVideo/AgoraCameraSourcePush.swift index fa2a93783..7adbf36d3 100644 --- a/iOS/APIExample/APIExample/Common/ExternalVideo/AgoraCameraSourcePush.swift +++ b/iOS/APIExample/APIExample/Common/ExternalVideo/AgoraCameraSourcePush.swift @@ -133,21 +133,22 @@ private extension AgoraCameraSourcePush { } func captureDevice(atIndex index: Int) -> AVCaptureDevice?
{ - let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: AVMediaType.video, position: .back) + let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera], + mediaType: .video, + position: .back) let devices = deviceDiscoverySession.devices let count = devices.count - guard count > 0, index >= 0 else { + guard !devices.isEmpty, index >= 0 else { return nil } let device: AVCaptureDevice - if index >= count { - device = devices.last! + if index >= count, let d = devices.last { + device = d } else { device = devices[index] } - return device } } diff --git a/iOS/APIExample/APIExample/Common/ExternalVideo/AgoraMetalRender.swift b/iOS/APIExample/APIExample/Common/ExternalVideo/AgoraMetalRender.swift index b08bb2ac2..d7af2e947 100644 --- a/iOS/APIExample/APIExample/Common/ExternalVideo/AgoraMetalRender.swift +++ b/iOS/APIExample/APIExample/Common/ExternalVideo/AgoraMetalRender.swift @@ -10,7 +10,6 @@ import CoreMedia import Metal import MetalKit - #if os(iOS) && (!arch(i386) && !arch(x86_64)) import MetalKit #endif @@ -20,7 +19,7 @@ protocol AgoraMetalRenderMirrorDataSource: NSObjectProtocol { func renderViewShouldMirror(renderView: AgoraMetalRender) -> Bool } -enum AgoraVideoRotation:Int { +enum AgoraVideoRotation: Int { /** 0: No rotation */ case rotationNone = 0 /** 1: 90 degrees */ @@ -160,7 +159,10 @@ extension AgoraMetalRender: AgoraVideoFrameDelegate { } func getVideoFormatPreference() -> AgoraVideoFormat { - return .NV12 + .cvPixelNV12 + } + func getObservedFramePosition() -> AgoraVideoFramePosition { + .preRenderer } } @@ -203,8 +205,12 @@ private extension AgoraMetalRender { } #if os(iOS) && (!arch(i386) && !arch(x86_64)) - func texture(pixelBuffer: CVPixelBuffer, textureCache: CVMetalTextureCache?, planeIndex: Int = 0, pixelFormat: MTLPixelFormat = .bgra8Unorm) -> MTLTexture? 
{ - guard let textureCache = textureCache, CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly) == kCVReturnSuccess else { + func texture(pixelBuffer: CVPixelBuffer, + textureCache: CVMetalTextureCache?, + planeIndex: Int = 0, + pixelFormat: MTLPixelFormat = .bgra8Unorm) -> MTLTexture? { + guard let textureCache = textureCache, + CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly) == kCVReturnSuccess else { return nil } defer { @@ -216,7 +222,15 @@ private extension AgoraMetalRender { let height = isPlanar ? CVPixelBufferGetHeightOfPlane(pixelBuffer, planeIndex) : CVPixelBufferGetHeight(pixelBuffer) var imageTexture: CVMetalTexture? - let result = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, nil, pixelFormat, width, height, planeIndex, &imageTexture) + let result = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, + textureCache, + pixelBuffer, + nil, + pixelFormat, + width, + height, + planeIndex, + &imageTexture) guard let unwrappedImageTexture = imageTexture, let texture = CVMetalTextureGetTexture(unwrappedImageTexture), @@ -273,7 +287,7 @@ extension AgoraMetalRender: MTKViewDelegate { encoder.popDebugGroup() encoder.endEncoding() - commandBuffer.addScheduledHandler { [weak self] (buffer) in + commandBuffer.addScheduledHandler { [weak self] _ in self?.semaphore.signal() } commandBuffer.present(currentDrawable) @@ -281,10 +295,14 @@ extension AgoraMetalRender: MTKViewDelegate { } } #endif - extension AgoraVideoRotation { - func renderedCoordinates(mirror: Bool, videoSize: CGSize, viewSize: CGSize) -> [simd_float4]? { - guard viewSize.width > 0, viewSize.height > 0, videoSize.width > 0, videoSize.height > 0 else { + func renderedCoordinates(mirror: Bool, + videoSize: CGSize, + viewSize: CGSize) -> [simd_float4]? 
{ + guard viewSize.width > 0, + viewSize.height > 0, + videoSize.width > 0, + videoSize.height > 0 else { return nil } @@ -308,10 +326,10 @@ extension AgoraVideoRotation { y = 1 } - let A = simd_float4( x, -y, 0.0, 1.0 ) - let B = simd_float4( -x, -y, 0.0, 1.0 ) - let C = simd_float4( x, y, 0.0, 1.0 ) - let D = simd_float4( -x, y, 0.0, 1.0 ) + let A = simd_float4(x, -y, 0.0, 1.0) + let B = simd_float4(-x, -y, 0.0, 1.0) + let C = simd_float4(x, y, 0.0, 1.0) + let D = simd_float4(-x, y, 0.0, 1.0) switch self { case .rotationNone: diff --git a/iOS/APIExample/APIExample/Common/GlobalSettings.swift b/iOS/APIExample/APIExample/Common/GlobalSettings.swift index 9f6c20f15..9f2479801 100644 --- a/iOS/APIExample/APIExample/Common/GlobalSettings.swift +++ b/iOS/APIExample/APIExample/Common/GlobalSettings.swift @@ -9,20 +9,19 @@ import Foundation import AgoraRtcKit - -let SCREEN_SHARE_UID_MIN:UInt = 501 -let SCREEN_SHARE_UID_MAX:UInt = 1000 -let SCREEN_SHARE_BROADCASTER_UID_MIN:UInt = 1001 -let SCREEN_SHARE_BROADCASTER_UID_MAX:UInt = 2000 +let SCREEN_SHARE_UID_MIN: UInt = 501 +let SCREEN_SHARE_UID_MAX: UInt = 1000 +let SCREEN_SHARE_BROADCASTER_UID_MIN: UInt = 1001 +let SCREEN_SHARE_BROADCASTER_UID_MAX: UInt = 2000 let SCREEN_SHARE_UID = UInt.random(in: SCREEN_SHARE_UID_MIN...SCREEN_SHARE_UID_MAX) let SCREEN_SHARE_BROADCASTER_UID = UInt.random(in: SCREEN_SHARE_BROADCASTER_UID_MIN...SCREEN_SHARE_BROADCASTER_UID_MAX) -//let SCREEN_SHARE_BROADCASTER_UID = 2000 // As per app group didn't enable in this demo, harded code screen share broadcaster uid +// let SCREEN_SHARE_BROADCASTER_UID = 2000 // App groups are not enabled in this demo, so the screen share broadcaster uid is hard-coded struct SettingItemOption { var idx: Int - var label:String - var value:Any + var label: String + var value: Any } class SettingItem { @@ -41,10 +40,13 @@ class SettingItem { class GlobalSettings { // The region for connection. This advanced feature applies to scenarios that have regional restrictions.
- // For the regions that Agora supports, see https://docs.agora.io/en/Interactive%20Broadcast/API%20Reference/oc/Constants/AgoraAreaCode.html. After specifying the region, the SDK connects to the Agora servers within that region. - var area:AgoraAreaCodeType = .global + /** For the regions that Agora supports, + * see https://docs.agora.io/en/Interactive%20Broadcast/API%20Reference/oc/Constants/AgoraAreaCode.html. + * After specifying the region, the SDK connects to the Agora servers within that region. + */ + var area: AgoraAreaCodeType = .global static let shared = GlobalSettings() - var settings:[String: SettingItem] = [ + var settings: [String: SettingItem] = [ "resolution": SettingItem(selected: 3, options: [ SettingItemOption(idx: 0, label: "90x90", value: CGSize(width: 90, height: 90)), SettingItemOption(idx: 1, label: "160x120", value: CGSize(width: 160, height: 120)), @@ -67,9 +69,9 @@ class GlobalSettings { "role": SettingItem(selected: 0, options: [ SettingItemOption(idx: 0, label: "broadcaster", value: AgoraClientRole.broadcaster), SettingItemOption(idx: 1, label: "audience", value: AgoraClientRole.audience) - ]), + ]) ] - func getSetting(key:String) -> SettingItem? { + func getSetting(key: String) -> SettingItem? { return settings[key] } diff --git a/iOS/APIExample/APIExample/Common/KeyCenter.swift b/iOS/APIExample/APIExample/Common/KeyCenter.swift index 2c461d515..883759b2b 100644 --- a/iOS/APIExample/APIExample/Common/KeyCenter.swift +++ b/iOS/APIExample/APIExample/Common/KeyCenter.swift @@ -25,7 +25,7 @@ class KeyCenter: NSObject { Log in to the Agora Console (https://console.agora.io/), create a project, and open the project configuration page to find the App ID. */ @objc - static let AppId: String = <#YOUR APPID#> + static let AppId: String = <#YOUR AppId#> /** Certificate. @@ -39,5 +39,8 @@ class KeyCenter: NSObject { Log in to the Agora Console (https://console.agora.io/), create a project with certificate authentication enabled, and open the project configuration page to find the App Certificate. Note: leave this field empty if certificate authentication is not enabled for the project. */ - static let Certificate: String?
= <#YOUR Certificate#> + static let Certificate: String? = nil + + + static let FaceCaptureLicense: String? = nil } diff --git a/iOS/APIExample/APIExample/Common/LogViewController.swift b/iOS/APIExample/APIExample/Common/LogViewController.swift index ae52c2ecd..0b5974952 100644 --- a/iOS/APIExample/APIExample/Common/LogViewController.swift +++ b/iOS/APIExample/APIExample/Common/LogViewController.swift @@ -22,14 +22,14 @@ enum LogLevel { } struct LogItem { - var message:String - var level:LogLevel - var dateTime:Date + var message: String + var level: LogLevel + var dateTime: Date } class LogUtils { - static var logs:[LogItem] = [] - static var appLogPath:String = "\(logFolder())/app-\(Date().getFormattedDate(format: "yyyy-MM-dd")).log" + static var logs: [LogItem] = [] + static var appLogPath: String = "\(logFolder())/app-\(Date().getFormattedDate(format: "yyyy-MM-dd")).log" static func log(message: String, level: LogLevel) { LogUtils.logs.append(LogItem(message: message, level: level, dateTime: Date())) @@ -87,11 +87,11 @@ extension LogViewController: UITableViewDataSource { } let logitem = LogUtils.logs[indexPath.row] cell?.textLabel?.font = UIFont.systemFont(ofSize: 12) - cell?.textLabel?.numberOfLines = 0; - cell?.textLabel?.lineBreakMode = .byWordWrapping; + cell?.textLabel?.numberOfLines = 0 + cell?.textLabel?.lineBreakMode = .byWordWrapping let dateFormatterPrint = DateFormatter() dateFormatterPrint.dateFormat = "yyyy-MM-dd HH:mm:ss" cell?.textLabel?.text = "\(dateFormatterPrint.string(from: logitem.dateTime)) - \(logitem.level.description): \(logitem.message)" - return cell! + return cell ?? 
UITableViewCell() } } diff --git a/iOS/APIExample/APIExample/Common/NetworkManager/JSONObject.swift b/iOS/APIExample/APIExample/Common/NetworkManager/JSONObject.swift index 6f19eb644..c65476705 100644 --- a/iOS/APIExample/APIExample/Common/NetworkManager/JSONObject.swift +++ b/iOS/APIExample/APIExample/Common/NetworkManager/JSONObject.swift @@ -17,7 +17,9 @@ class JSONObject { static func toModel(_ type: T.Type, value: Any) -> T? { guard let data = try? JSONSerialization.data(withJSONObject: value) else { return nil } let decoder = JSONDecoder() - decoder.nonConformingFloatDecodingStrategy = .convertFromString(positiveInfinity: "+Infinity", negativeInfinity: "-Infinity", nan: "NaN") + decoder.nonConformingFloatDecodingStrategy = .convertFromString(positiveInfinity: "+Infinity", + negativeInfinity: "-Infinity", + nan: "NaN") return try? decoder.decode(type, from: data) } /// Convert a JSON string to a model @@ -28,8 +30,14 @@ class JSONObject { /// Convert a JSON string to a model static func toModel(_ type: T.Type, value: String) -> T? { let decoder = JSONDecoder() - decoder.nonConformingFloatDecodingStrategy = .convertFromString(positiveInfinity: "+Infinity", negativeInfinity: "-Infinity", nan: "NaN") - guard let t = try? decoder.decode(T.self, from: value.data(using: .utf8)!) else { return nil } + decoder.nonConformingFloatDecodingStrategy = .convertFromString(positiveInfinity: "+Infinity", + negativeInfinity: "-Infinity", + nan: "NaN") + guard let data = value.data(using: .utf8), + let t = try? decoder.decode(T.self, from: data) + else { + return nil + } return t } /// Convert a model to a JSON string @@ -52,13 +60,22 @@ class JSONObject { /// Convert a JSON string to a dictionary static func toDictionary(jsonString: String) -> [String: Any] { guard let jsonData = jsonString.data(using: .utf8) else { return [:] } - guard let dict = try?
JSONSerialization.jsonObject(with: jsonData, + options: .mutableContainers), + let result = dict as? [String: Any] + else { + return [:] + } return result } /// Convert a JSON string to a dictionary static func toDictionary(jsonStr: String) -> [String: String] { guard let jsonData = jsonStr.data(using: .utf8) else { return [:] } - guard let dict = try? JSONSerialization.jsonObject(with: jsonData, options: .mutableContainers), let result = dict as? [String: Any] else { return [:] } + guard let dict = try? JSONSerialization.jsonObject(with: jsonData, options: .mutableContainers), + let result = dict as? [String: Any] + else { + return [:] + } var data = [String: String]() for item in result { data[item.key] = "\(item.value)" @@ -68,13 +85,18 @@ class JSONObject { /// Convert a JSON string to an array static func toArray(jsonString: String) -> [[String: Any]]? { guard let jsonData = jsonString.data(using: .utf8) else { return nil } - guard let array = try? JSONSerialization.jsonObject(with: jsonData, options: .mutableContainers), let result = array as? [[String: Any]] else { return nil } + guard let array = try? JSONSerialization.jsonObject(with: jsonData, + options: .mutableContainers), + let result = array as? [[String: Any]] + else { + return nil + } return result } /// Convert a dictionary to a JSON string static func toJsonString(dict: [String: Any]?) -> String? { guard let dict = dict else { return nil } - if (!JSONSerialization.isValidJSONObject(dict)) { + if !JSONSerialization.isValidJSONObject(dict) { print("String format error!") return nil } @@ -90,7 +112,7 @@ class JSONObject { let count = array.count for dict in array { guard let dict = dict else { return nil } - if (!JSONSerialization.isValidJSONObject(dict)) { + if !JSONSerialization.isValidJSONObject(dict) { print("String format error!") return nil } @@ -100,7 +122,7 @@ class JSONObject { if i < count - 1 { jsonString.append(",") } - i = i + 1 + i += 1 } jsonString.append("]") return jsonString @@ -111,7 +133,7 @@ extension String { func toArray() -> [[String: Any]]?
{ JSONObject.toArray(jsonString: self) } - func toDictionary() -> [String : String] { + func toDictionary() -> [String: String] { JSONObject.toDictionary(jsonStr: self) } } diff --git a/iOS/APIExample/APIExample/Common/NetworkManager/NetworkManager.swift b/iOS/APIExample/APIExample/Common/NetworkManager/NetworkManager.swift index 99517c753..14c2a52eb 100644 --- a/iOS/APIExample/APIExample/Common/NetworkManager/NetworkManager.swift +++ b/iOS/APIExample/APIExample/Common/NetworkManager/NetworkManager.swift @@ -10,8 +10,8 @@ import UIKit @objc class NetworkManager: NSObject { enum HTTPMethods: String { - case GET = "GET" - case POST = "POST" + case GET + case POST } typealias SuccessClosure = ([String: Any]) -> Void @@ -47,7 +47,7 @@ class NetworkManager: NSObject { "src": "iOS", "ts": "".timeStamp, "type": 1, - "uid": "\(uid)"] as [String : Any] + "uid": "\(uid)"] as [String: Any] ToastView.showWait(text: "loading...", view: nil) NetworkManager.shared.postRequest(urlString: "https://toolbox.bj2.agoralab.co/v1/token/generate", params: params, success: { response in let data = response["data"] as? [String: String] @@ -78,8 +78,9 @@ class NetworkManager: NSObject { return } DispatchQueue.global().async { - let downloadTask = session.downloadTask(with: request) { location, response, error in - let locationPath = location!.path + let downloadTask = session.downloadTask(with: request) { location, _, _ in + guard let location = location else { return } + let locationPath = location.path let fileManager = FileManager.default try? 
fileManager.moveItem(atPath: locationPath, toPath: documnets) success?(["fileName": fileName, "path": documnets]) @@ -105,7 +106,7 @@ class NetworkManager: NSObject { method: method, success: success, failure: failure) else { return } - session.dataTask(with: request) { data, response, error in + session.dataTask(with: request) { data, response, _ in DispatchQueue.main.async { self.checkResponse(response: response, data: data, success: success, failure: failure) } @@ -125,7 +126,8 @@ class NetworkManager: NSObject { var request = URLRequest(url: url) request.httpMethod = method.rawValue if method == .POST { - request.httpBody = try? JSONSerialization.data(withJSONObject: params ?? [], options: .sortedKeys)//convertParams(params: params).data(using: .utf8) + request.httpBody = try? JSONSerialization.data(withJSONObject: params ?? [], + options: .sortedKeys) } let curl = request.cURL(pretty: true) debugPrint("curl == \(curl)") @@ -157,7 +159,7 @@ class NetworkManager: NSObject { } extension URLRequest { - public func cURL(pretty: Bool = false) -> String { + func cURL(pretty: Bool = false) -> String { let newLine = pretty ? "\\\n" : "" let method = (pretty ? "--request " : "-X ") + "\(httpMethod ?? "GET") \(newLine)" let url: String = (pretty ? "--url " : "") + "\'\(url?.absoluteString ?? "")\' \(newLine)" @@ -166,8 +168,8 @@ extension URLRequest { var header = "" var data: String = "" - if let httpHeaders = allHTTPHeaderFields, httpHeaders.keys.count > 0 { - for (key,value) in httpHeaders { + if let httpHeaders = allHTTPHeaderFields, !httpHeaders.keys.isEmpty { + for (key, value) in httpHeaders { header += (pretty ? 
"--header " : "-H ") + "\'\(key): \(value)\' \(newLine)" } } diff --git a/iOS/APIExample/APIExample/Common/NetworkManager/ToastView.swift b/iOS/APIExample/APIExample/Common/NetworkManager/ToastView.swift index e722dddf8..d959c1b09 100644 --- a/iOS/APIExample/APIExample/Common/NetworkManager/ToastView.swift +++ b/iOS/APIExample/APIExample/Common/NetworkManager/ToastView.swift @@ -215,7 +215,7 @@ class ToastView: UIView { label.leadingAnchor.constraint(equalTo: tagImageView.trailingAnchor, constant: 5).isActive = true label.topAnchor.constraint(equalTo: topAnchor, constant: 10).isActive = true label.trailingAnchor.constraint(equalTo: trailingAnchor, constant: -10).isActive = true - label.bottomAnchor.constraint(equalTo: bottomAnchor,constant: -10).isActive = true + label.bottomAnchor.constraint(equalTo: bottomAnchor, constant: -10).isActive = true } } extension UIViewController { @@ -239,18 +239,15 @@ extension UIViewController { let viewController = viewController ?? keyWindow?.rootViewController if let navigationController = viewController as? UINavigationController, - !navigationController.viewControllers.isEmpty - { + !navigationController.viewControllers.isEmpty { return self.cl_topViewController(navigationController.viewControllers.last) } else if let tabBarController = viewController as? 
UITabBarController, - let selectedController = tabBarController.selectedViewController - { + let selectedController = tabBarController.selectedViewController { return self.cl_topViewController(selectedController) } else if let presentedController = viewController?.presentedViewController { return self.cl_topViewController(presentedController) - } return viewController } diff --git a/iOS/APIExample/APIExample/Common/PickerView.swift b/iOS/APIExample/APIExample/Common/PickerView.swift index b79ad961a..b9490b4fe 100644 --- a/iOS/APIExample/APIExample/Common/PickerView.swift +++ b/iOS/APIExample/APIExample/Common/PickerView.swift @@ -97,7 +97,7 @@ class PickerView: UIView { } @objc private func onTapSureButton() { - pickerViewSelectedValueClosure?(selectedValue ?? "") + pickerViewSelectedValueClosure?(selectedValue ?? dataArray?.first ?? "") AlertManager.hiddenView() } } diff --git a/iOS/APIExample/APIExample/Common/Settings/SettingsCells.swift b/iOS/APIExample/APIExample/Common/Settings/SettingsCells.swift index 691e53187..d3735c855 100644 --- a/iOS/APIExample/APIExample/Common/Settings/SettingsCells.swift +++ b/iOS/APIExample/APIExample/Common/Settings/SettingsCells.swift @@ -8,38 +8,35 @@ import Foundation -class SettingsBaseCell : UITableViewCell -{ - var configs:SettingsBaseParam? - weak var delegate:SettingsViewControllerDelegate? - func configure(configs:SettingsBaseParam){ +class SettingsBaseCell: UITableViewCell { + var configs: SettingsBaseParam? + weak var delegate: SettingsViewControllerDelegate? 
+ func configure(configs: SettingsBaseParam) { self.configs = configs } } -class SettingsBaseParam: NSObject -{ - var key:String - var label:String - var type:String +class SettingsBaseParam: NSObject { + var key: String + var label: String + var type: String - init(key:String, label:String, type:String) { + init(key: String, label: String, type: String) { self.key = key self.label = label self.type = type } } -class SettingsSliderCell : SettingsBaseCell -{ +class SettingsSliderCell: SettingsBaseCell { @IBOutlet weak var settingLabel: UILabel! @IBOutlet weak var settingSlider: UISlider! @IBOutlet weak var settingValue: UILabel! - @IBAction func onSliderValueChanged(sender:UISlider){ - let val = (sender.value*100).rounded()/100 + @IBAction func onSliderValueChanged(sender: UISlider) { + let val = (sender.value * 100).rounded() / 100 settingValue.text = "\(val)" - guard let configs = self.configs as? SettingsSliderParam else {return} + guard let configs = self.configs as? SettingsSliderParam else { return } delegate?.didChangeValue(type: "SettingsSliderCell", key: configs.key, value: val) } @@ -56,10 +53,10 @@ class SettingsSliderCell : SettingsBaseCell } class SettingsSliderParam: SettingsBaseParam { - var value:Float - var minimumValue:Float - var maximumValue:Float - init(key:String, label:String, value:Float, minimumValue:Float, maximumValue:Float) { + var value: Float + var minimumValue: Float + var maximumValue: Float + init(key: String, label: String, value: Float, minimumValue: Float, maximumValue: Float) { self.value = value self.minimumValue = minimumValue self.maximumValue = maximumValue @@ -67,54 +64,54 @@ class SettingsSliderParam: SettingsBaseParam { } } - -class SettingsLabelCell : SettingsBaseCell -{ +class SettingsLabelCell: SettingsBaseCell { @IBOutlet weak var settingLabel: UILabel! @IBOutlet weak var settingValue: UILabel! override func configure(configs: SettingsBaseParam) { super.configure(configs: configs) - guard let param = configs as? 
SettingsLabelParam else {return} + guard let param = configs as? SettingsLabelParam else { return } settingLabel.text = param.label settingValue.text = param.value } } class SettingsLabelParam: SettingsBaseParam { - var value:String - init(key:String, label:String, value:String) { + var value: String + init(key: String, label: String, value: String) { self.value = value super.init(key: key, label: label, type: "LabelCell") } } -class SettingsSelectCell : SettingsBaseCell -{ +class SettingsSelectCell: SettingsBaseCell { @IBOutlet weak var settingLabel: UILabel! @IBOutlet weak var settingBtn: UIButton! override func configure(configs: SettingsBaseParam) { super.configure(configs: configs) - guard let param = configs as? SettingsSelectParam else {return} + guard let param = configs as? SettingsSelectParam else { return } settingLabel.text = param.label settingBtn.setTitle(param.value, for: .normal) } - func getSelectAction(_ option:SettingItemOption) -> UIAlertAction { - return UIAlertAction(title: "\(option.label)", style: .default, handler: {[unowned self] action in - guard let param = self.configs as? SettingsSelectParam else {return} + func getSelectAction(_ option: SettingItemOption) -> UIAlertAction { + return UIAlertAction(title: "\(option.label)", style: .default, handler: { [unowned self] _ in + guard let param = self.configs as? SettingsSelectParam else { return } self.settingBtn.setTitle(option.label, for: .normal) param.settingItem.selected = option.idx self.delegate?.didChangeValue(type: "SettingsSelectCell", key: param.key, value: param.settingItem) }) } - @IBAction func onSelect(_ sender:UIButton) { - let alert = UIAlertController(title: nil, message: nil, preferredStyle: UIDevice.current.userInterfaceIdiom == .pad ? UIAlertController.Style.alert : UIAlertController.Style.actionSheet) - guard let param = configs as? 
SettingsSelectParam else {return} + @IBAction func onSelect(_ sender: UIButton) { + let style: UIAlertController.Style = UIDevice.current.userInterfaceIdiom == .pad ? .alert : .actionSheet + let alert = UIAlertController(title: nil, + message: nil, + preferredStyle: style) + guard let param = configs as? SettingsSelectParam else { return } for option in param.settingItem.options { alert.addAction(getSelectAction(option)) } @@ -124,10 +121,10 @@ class SettingsSelectCell : SettingsBaseCell } class SettingsSelectParam: SettingsBaseParam { - var value:String - var settingItem:SettingItem - weak var context:UIViewController?; - init(key:String, label:String, settingItem:SettingItem, context:UIViewController) { + var value: String + var settingItem: SettingItem + weak var context: UIViewController? + init(key: String, label: String, settingItem: SettingItem, context: UIViewController) { self.settingItem = settingItem self.context = context self.value = settingItem.selectedOption().label @@ -135,22 +132,21 @@ class SettingsSelectParam: SettingsBaseParam { } } -class SettingsTextFieldCell : SettingsBaseCell -{ +class SettingsTextFieldCell: SettingsBaseCell { @IBOutlet weak var settingLabel: UILabel! @IBOutlet weak var textField: UITextField! override func configure(configs: SettingsBaseParam) { super.configure(configs: configs) - guard let param = configs as? SettingsTextFieldParam else {return} + guard let param = configs as? SettingsTextFieldParam else { return } settingLabel.text = param.label textField.placeholder = param.placeholder textField.text = param.text } @IBAction func onTextFieldChanged(_ sender: UITextField) { - guard let configs = self.configs as? SettingsTextFieldParam else {return} + guard let configs = self.configs as? SettingsTextFieldParam else { return } delegate?.didChangeValue(type: "SettingsTextFieldCell", key: configs.key, value: sender.text ?? 
"") } } @@ -165,22 +161,23 @@ class SettingsTextFieldParam: SettingsBaseParam { } } -class SettingsSwitchCell : SettingsBaseCell -{ +class SettingsSwitchCell: SettingsBaseCell { @IBOutlet weak var settingLabel: UILabel! @IBOutlet weak var uiSwitch: UISwitch! override func configure(configs: SettingsBaseParam) { super.configure(configs: configs) - guard let param = configs as? SettingsSwitchParam else {return} + guard let param = configs as? SettingsSwitchParam else { return } settingLabel.text = param.label uiSwitch.isOn = param.isOn } - @IBAction func onSwitchChanged(sender: UISwitch){ - guard let configs = self.configs as? SettingsSwitchParam else {return} - delegate?.didChangeValue(type: "SettingsSwitchCell", key: configs.key, value: "\(uiSwitch.isOn ? 1 : 0)") + @IBAction func onSwitchChanged(sender: UISwitch) { + guard let configs = self.configs as? SettingsSwitchParam else { return } + delegate?.didChangeValue(type: "SettingsSwitchCell", + key: configs.key, + value: "\(uiSwitch.isOn ? 1 : 0)") } } diff --git a/iOS/APIExample/APIExample/Common/Settings/SettingsViewController.swift b/iOS/APIExample/APIExample/Common/Settings/SettingsViewController.swift index 5bcb229ae..32605dcea 100644 --- a/iOS/APIExample/APIExample/Common/Settings/SettingsViewController.swift +++ b/iOS/APIExample/APIExample/Common/Settings/SettingsViewController.swift @@ -10,14 +10,13 @@ import Foundation import UIKit protocol SettingsViewControllerDelegate: AnyObject { - func didChangeValue(type:String, key:String, value: Any) + func didChangeValue(type: String, key: String, value: Any) } -class SettingsViewController : UITableViewController -{ - var sections:[[SettingsBaseParam]] = [] - var sectionNames:[String] = [] - weak var settingsDelegate:SettingsViewControllerDelegate? +class SettingsViewController: UITableViewController { + var sections: [[SettingsBaseParam]] = [] + var sectionNames: [String] = [] + weak var settingsDelegate: SettingsViewControllerDelegate? 
override func numberOfSections(in tableView: UITableView) -> Int { return sections.count } @@ -29,7 +28,9 @@ class SettingsViewController : UITableViewController override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell { let param = sections[indexPath.section][indexPath.row] - guard let cell = tableView.dequeueReusableCell(withIdentifier: param.type, for: indexPath) as? SettingsBaseCell else {return UITableViewCell()} + guard let cell = tableView.dequeueReusableCell(withIdentifier: param.type, for: indexPath) as? SettingsBaseCell else { + return UITableViewCell() + } cell.delegate = settingsDelegate cell.configure(configs: param) diff --git a/iOS/APIExample/APIExample/Common/StatisticsInfo.swift b/iOS/APIExample/APIExample/Common/StatisticsInfo.swift index 0a410bbdf..49a5795ad 100755 --- a/iOS/APIExample/APIExample/Common/StatisticsInfo.swift +++ b/iOS/APIExample/APIExample/Common/StatisticsInfo.swift @@ -33,11 +33,12 @@ struct StatisticsInfo { } var dimension = CGSize.zero - var fps:UInt = 0 + var fps: UInt = 0 var firstFrameElapsedTime: Double = 0 var preloadElapsedTime: Double = 0 var uid: UInt = 0 var remoteUid: UInt = 0 + var metaInfo: String? var type: StatisticsType @@ -124,7 +125,11 @@ struct StatisticsInfo { preloadElapsedTime = Double(info.join2JoinSuccess) } - func description(audioOnly:Bool) -> String { + mutating func updateMetaInfo(data: String?) 
{ + metaInfo = data + } + + func description(audioOnly: Bool) -> String { var full: String switch type { case .local(let info): full = localDescription(info: info, audioOnly: audioOnly) @@ -142,7 +147,7 @@ struct StatisticsInfo { let videoSend = "VSend: \(info.videoStats.sentBitrate)kbps" let audioSend = "ASend: \(info.audioStats.sentBitrate)kbps" let cpu = "CPU: \(info.channelStats.cpuAppUsage)%/\(info.channelStats.cpuTotalUsage)%" - //TODO + // TODO // let vSendLoss = "VSend Loss: \(info.videoStats.txPacketLossRate)%" // let aSendLoss = "ASend Loss: \(info.audioStats.txPacketLossRate)%" let vSendLoss = "VSend Loss: MISSING%" @@ -150,11 +155,18 @@ struct StatisticsInfo { let firstFrame = "firstFrameTime: \(firstFrameElapsedTime)" - if(audioOnly) { - let array = firstFrameElapsedTime > 0 ? [firstFrame, lastmile,audioSend,cpu,aSendLoss] : [lastmile,audioSend,cpu,aSendLoss] + if audioOnly { + let array = firstFrameElapsedTime > 0 + ? [firstFrame, lastmile, audioSend, cpu, aSendLoss] + : [lastmile, audioSend, cpu, aSendLoss] return array.joined(separator: "\n") } - var array = firstFrameElapsedTime > 0 ? [localUid, firstFrame, dimensionFps,lastmile,videoSend,audioSend,cpu,vSendLoss,aSendLoss] : [localUid, dimensionFps,lastmile,videoSend,audioSend,cpu,vSendLoss,aSendLoss] + var array = firstFrameElapsedTime > 0 + ? [localUid, firstFrame, dimensionFps, lastmile, videoSend, audioSend, cpu, vSendLoss, aSendLoss] + : [localUid, dimensionFps, lastmile, videoSend, audioSend, cpu, vSendLoss, aSendLoss] + if let metaInfo = metaInfo { + array.append(metaInfo) + } return array.joined(separator: "\n") } @@ -176,11 +188,15 @@ struct StatisticsInfo { let audioLoss = "ALoss: \(info.audioStats.audioLossRate)%" let aquality = "AQuality: \(audioQuality.description())" let preloadTime = "join2Success: \(preloadElapsedTime)" - if(audioOnly) { - let array = firstFrameElapsedTime > 0 ? 
[firstFrame, audioRecv,audioLoss,aquality] : [audioRecv,audioLoss,aquality] + if audioOnly { + let array = firstFrameElapsedTime > 0 + ? [firstFrame, audioRecv, audioLoss, aquality] + : [audioRecv, audioLoss, aquality] return array.joined(separator: "\n") } - var array = firstFrameElapsedTime > 0 ? [uid,firstFrame, dimensionFpsBit,videoRecv,audioRecv,videoLoss,audioLoss,aquality, preloadTime] : [uid, dimensionFpsBit,videoRecv,audioRecv,videoLoss,audioLoss,aquality, preloadTime] + var array = firstFrameElapsedTime > 0 + ? [uid, firstFrame, dimensionFpsBit, videoRecv, audioRecv, videoLoss, audioLoss, aquality, preloadTime] + : [uid, dimensionFpsBit, videoRecv, audioRecv, videoLoss, audioLoss, aquality, preloadTime] if preloadElapsedTime <= 0 { array.removeLast() } diff --git a/iOS/APIExample/APIExample/Common/UITypeAlias.swift b/iOS/APIExample/APIExample/Common/UITypeAlias.swift index 033ee948a..8b33ee391 100644 --- a/iOS/APIExample/APIExample/Common/UITypeAlias.swift +++ b/iOS/APIExample/APIExample/Common/UITypeAlias.swift @@ -38,23 +38,20 @@ enum Font { case boldItalic = "BoldItalic" func with(size: CGFloat) -> UIFont { - return UIFont(name: "HelveticaNeue-\(rawValue)", size: size)! + UIFont(name: "HelveticaNeue-\(rawValue)", size: size) ?? .systemFont(ofSize: size) } } } extension UIColor { - /// Get color rgba components in order. 
func rgba() -> (r: CGFloat, g: CGFloat, b: CGFloat, a: CGFloat) { - let components = self.cgColor.components + guard let components = self.cgColor.components else { return (0, 0, 0, 0) } let numberOfComponents = self.cgColor.numberOfComponents switch numberOfComponents { - case 4: - return (components![0], components![1], components![2], components![3]) - case 2: - return (components![0], components![0], components![0], components![1]) + case 4: return (components[0], components[1], components[2], components[3]) + case 2: return (components[0], components[0], components[0], components[1]) default: // FIXME: Fallback to black return (0, 0, 0, 1) @@ -64,13 +61,12 @@ extension UIColor { /// Check the black or white contrast on given color. func blackOrWhiteContrastingColor() -> Color { let rgbaT = rgba() - let value = 1 - ((0.299 * rgbaT.r) + (0.587 * rgbaT.g) + (0.114 * rgbaT.b)); + let value = 1 - ((0.299 * rgbaT.r) + (0.587 * rgbaT.g) + (0.114 * rgbaT.b)) return value < 0.5 ? Color.black : Color.white } - } -enum AssetsColor : String { +enum AssetsColor: String { case videoBackground case videoPlaceholder case textShadow @@ -78,12 +74,11 @@ extension UIColor { static func appColor(_ name: AssetsColor) -> UIColor? { - return UIColor(named: name.rawValue) + UIColor(named: name.rawValue) } } extension UIView { - /// Adds constraints to this `UIView` instance's `superview` object to make sure this always has the same size as the superview. /// Please note that this has no effect if its `superview` is `nil` - add this `UIView` instance as a subview before calling this. func bindFrameToSuperviewBounds() { @@ -91,17 +86,15 @@ extension UIView { print("Error! 
`superview` was nil - call `addSubview(view: UIView)` before calling `bindFrameToSuperviewBounds()` to fix this.") return } - self.translatesAutoresizingMaskIntoConstraints = false self.topAnchor.constraint(equalTo: superview.topAnchor, constant: 0).isActive = true self.bottomAnchor.constraint(equalTo: superview.bottomAnchor, constant: 0).isActive = true self.leadingAnchor.constraint(equalTo: superview.leadingAnchor, constant: 0).isActive = true self.trailingAnchor.constraint(equalTo: superview.trailingAnchor, constant: 0).isActive = true - } } -//MARK: - Color +// MARK: - Color #if os(iOS) typealias AGColor = UIColor #else @@ -133,17 +126,16 @@ extension AGColor { convenience init(hex: String, alpha: CGFloat = 1) { var cString: String = hex.trimmingCharacters(in: .whitespacesAndNewlines).uppercased() - if (cString.hasPrefix("#")) { + if cString.hasPrefix("#") { let range = cString.index(after: cString.startIndex) ..< cString.endIndex cString = String(cString[range]) } - if (cString.hasPrefix("0X")) { + if cString.hasPrefix("0X") { let range = cString.index(cString.startIndex, offsetBy: 2) ..< cString.endIndex cString = String(cString[range]) } - - if (cString.count != 6) { + if cString.count != 6 { self.init() return } @@ -155,19 +147,21 @@ extension AGColor { } static func randomColor() -> AGColor { + // swiftlint:disable legacy_random let randomHex = Int(arc4random_uniform(0xCCCCCC) + 0x555555) + // swiftlint:enable legacy_random return AGColor(hex: randomHex) } } -//MARK: - Font +// MARK: - Font #if os(iOS) typealias AGFont = UIFont #else typealias AGFont = NSFont #endif -//MARK: - Image +// MARK: - Image #if os(iOS) typealias AGImage = UIImage #else @@ -258,7 +252,7 @@ extension AGLabel { #endif } -//MARK: - TextField +// MARK: - TextField #if os(iOS) typealias AGTextField = UITextField #else @@ -326,7 +320,7 @@ extension AGTextField { var stringValue: String { get { - return text! + return text ?? 
"" } set { text = newValue @@ -351,7 +345,7 @@ extension AGTextField { } } -//MARK: - Indicator +// MARK: - Indicator #if os(iOS) typealias AGIndicator = UIActivityIndicatorView #else @@ -378,7 +372,7 @@ extension AGIndicator { } -//MARK: - View +// MARK: - View #if os(iOS) typealias AGView = UIView #else @@ -500,35 +494,34 @@ extension AGView { #endif } - #if os(iOS) typealias AGVisualEffectView = UIVisualEffectView #else typealias AGVisualEffectView = NSVisualEffectView #endif -//MARK: - ImageView +// MARK: - ImageView #if os(iOS) typealias AGImageView = UIImageView #else typealias AGImageView = NSImageView #endif -//MARK: - TableView +// MARK: - TableView #if os(iOS) typealias AGTableView = UITableView #else typealias AGTableView = NSTableView #endif -//MARK: - TableViewCell +// MARK: - TableViewCell #if os(iOS) typealias AGTableViewCell = UITableViewCell #else typealias AGTableViewCell = NSTableCellView #endif -//MARK: - CollectionView +// MARK: - CollectionView #if os(iOS) typealias AGCollectionView = UICollectionView #else @@ -541,7 +534,7 @@ typealias AGCollectionViewFlowLayout = UICollectionViewFlowLayout typealias AGCollectionViewFlowLayout = NSCollectionViewFlowLayout #endif -//MARK: - CollectionViewCell +// MARK: - CollectionViewCell #if os(iOS) typealias AGCollectionViewCell = UICollectionViewCell #else @@ -561,7 +554,7 @@ extension AGCollectionViewCell { #endif } -//MARK: - Button +// MARK: - Button #if os(iOS) typealias AGButton = UIButton #else @@ -602,7 +595,9 @@ extension AGButton { set { let pstyle = NSMutableParagraphStyle() pstyle.alignment = .left - attributedTitle = NSAttributedString(string: title, attributes: [ NSAttributedString.Key.foregroundColor : newValue, NSAttributedString.Key.paragraphStyle : pstyle ]) + attributedTitle = NSAttributedString(string: title, + attributes: [ NSAttributedString.Key.foregroundColor: newValue, + NSAttributedString.Key.paragraphStyle: pstyle ]) } } #endif @@ -612,26 +607,26 @@ extension AGButton { 
UIView.animate(withDuration: 0.15, animations: { self.isEnabled = false self.alpha = 0.3 - }) { (_) in + }, completion: { _ in self.image = toImage self.alpha = 1.0 self.isEnabled = true - } + }) #else NSAnimationContext.runAnimationGroup({ (context) in context.duration = 0.3 self.isEnabled = false self.animator().alphaValue = 0.3 - }) { + }, completion: { self.image = toImage self.alphaValue = 1.0 self.isEnabled = true - } + }) #endif } } -//MARK: - Switch +// MARK: - Switch #if os(iOS) typealias AGSwitch = UISwitch #else @@ -650,7 +645,7 @@ extension AGSwitch { } #endif -//MARK: - WebView +// MARK: - WebView #if os(iOS) typealias AGWebView = WKWebView #else @@ -666,7 +661,7 @@ extension AGWebView { } #endif -//MARK: - Slider +// MARK: - Slider #if os(iOS) typealias AGSlider = UISlider #else @@ -726,14 +721,14 @@ extension AGSlider { #endif } -//MARK: - SegmentedControl +// MARK: - SegmentedControl #if os(iOS) typealias AGPopSheetButton = UIButton #else typealias AGPopSheetButton = NSPopUpButton #endif -//MARK: - SegmentedControl +// MARK: - SegmentedControl #if os(iOS) typealias AGSegmentedControl = UISegmentedControl #else @@ -752,7 +747,7 @@ extension AGSegmentedControl { } #endif -//MARK: - StoryboardSegue +// MARK: - StoryboardSegue #if os(iOS) typealias AGStoryboardSegue = UIStoryboardSegue #else @@ -760,32 +755,24 @@ typealias AGStoryboardSegue = NSStoryboardSegue #endif extension AGStoryboardSegue { var identifierString: String? { - get { - #if os(iOS) - return identifier - #else - return identifier - #endif - } + return identifier } #if os(iOS) var destinationController: AGViewController? 
{ - get { - return destination - } + return destination } #endif } -//MARK: - Storyboard +// MARK: - Storyboard #if os(iOS) typealias AGStoryboard = UIStoryboard #else typealias AGStoryboard = NSStoryboard #endif -//MARK: - ViewController +// MARK: - ViewController #if os(iOS) typealias AGViewController = UIViewController #else @@ -823,14 +810,13 @@ extension AGViewController { } } -//MARK: - TableViewController +// MARK: - TableViewController #if os(iOS) typealias AGTableViewController = UITableViewController #else typealias AGTableViewController = NSViewController #endif - #if os(iOS) typealias AGBezierPath = UIBezierPath #else @@ -856,7 +842,6 @@ typealias AGControl = UIControl typealias AGControl = NSControl #endif - #if os(OSX) extension String { func buttonWhiteAttributedTitleString() -> NSAttributedString { @@ -870,11 +855,10 @@ extension String { fileprivate func buttonAttributedTitleStringWithColor(_ color: AGColor) -> NSAttributedString { let attributes = [NSAttributedString.Key.foregroundColor: color, NSAttributedString.Key.font: NSFont.systemFont(ofSize: 13)] let attributedString = NSMutableAttributedString(string: self) - let range = NSMakeRange(0, attributedString.length) + let range = NSRange(location: 0, length: attributedString.length) attributedString.addAttributes(attributes, range: range) attributedString.setAlignment(.center, range: range) attributedString.fixAttributes(in: range) - return attributedString } } @@ -885,4 +869,3 @@ typealias AGApplication = UIApplication #else typealias AGApplication = NSApplication #endif - diff --git a/iOS/APIExample/APIExample/Common/Utils/Util.swift b/iOS/APIExample/APIExample/Common/Utils/Util.swift index c776dc8b6..f43acbcf7 100644 --- a/iOS/APIExample/APIExample/Common/Utils/Util.swift +++ b/iOS/APIExample/APIExample/Common/Utils/Util.swift @@ -7,9 +7,9 @@ // import UIKit +import AgoraRtcKit enum Util { - /// Configuring Privatization Parameters static func configPrivatization(agoraKit: AgoraRtcEngineKit) 
{ if !GlobalSettings.shared.getCache(key: "ip").isEmpty { @@ -31,17 +31,3 @@ enum Util { } } } - -struct Throttle { - static var timer: Timer? - static func throttle(_ interval: TimeInterval, block: @escaping () -> Void) { - guard timer == nil else { return } - timer = Timer.scheduledTimer(withTimeInterval: interval, repeats: false) { t in - // within one second, execute only the most recently triggered call t.invalidate() - timer = nil - block() - } - RunLoop.current.add(timer!, forMode: .common) - } -} diff --git a/iOS/APIExample/APIExample/Common/VideoView.swift b/iOS/APIExample/APIExample/Common/VideoView.swift index 46c209f18..e2b809fb4 100644 --- a/iOS/APIExample/APIExample/Common/VideoView.swift +++ b/iOS/APIExample/APIExample/Common/VideoView.swift @@ -9,20 +9,18 @@ import UIKit extension Bundle { - static func loadView<T>(fromNib name: String, withType type: T.Type) -> T { if let view = Bundle.main.loadNibNamed(name, owner: nil, options: nil)?.first as? T { return view } - fatalError("Could not load view with type " + String(describing: type)) } - static func loadVideoView(type:VideoView.StreamType, audioOnly:Bool) -> VideoView { + static func loadVideoView(type: VideoView.StreamType, audioOnly: Bool) -> VideoView { let view = Bundle.loadView(fromNib: "VideoView", withType: VideoView.self) view.audioOnly = audioOnly view.type = type - if(type.isLocal()) { + if type.isLocal() { view.statsInfo = StatisticsInfo(type: .local(StatisticsInfo.LocalInfo())) } else { view.statsInfo = StatisticsInfo(type: .remote(StatisticsInfo.RemoteInfo())) @@ -32,36 +30,41 @@ extension Bundle { } class VideoView: UIView { - - @IBOutlet weak var videoView:UIView! - @IBOutlet weak var placeholderLabel:UILabel! - @IBOutlet weak var infoLabel:UILabel! - @IBOutlet weak var statsLabel:UILabel! - var audioOnly:Bool = false - var uid:UInt = 0 + @IBOutlet weak var videoView: UIView! + @IBOutlet weak var placeholderLabel: UILabel! + @IBOutlet weak var infoLabel: UILabel! + @IBOutlet weak var statsLabel: UILabel! 
+ var audioOnly: Bool = false + var uid: UInt = 0 enum StreamType { case local case remote - func isLocal() -> Bool{ + func isLocal() -> Bool { switch self { case .local: return true case .remote: return false } } } - var statsInfo:StatisticsInfo? { - didSet{ - statsLabel.text = statsInfo?.description(audioOnly: audioOnly) + var statsInfo: StatisticsInfo? { + didSet { + if Thread.isMainThread { + statsLabel.text = statsInfo?.description(audioOnly: audioOnly) + } else { + DispatchQueue.main.async { + self.statsLabel.text = self.statsInfo?.description(audioOnly: self.audioOnly) + } + } } } - var type:StreamType? + var type: StreamType? - func setPlaceholder(text:String) { + func setPlaceholder(text: String) { placeholderLabel.text = text } - func setInfo(text:String) { + func setInfo(text: String) { infoLabel.text = text } @@ -71,6 +74,7 @@ class VideoView: UIView { statsLabel.layer.shadowOffset = CGSize(width: 1, height: 1) statsLabel.layer.shadowRadius = 1.0 statsLabel.layer.shadowOpacity = 0.7 + statsLabel.preferredMaxLayoutWidth = frame.width * 0.7 } } @@ -83,11 +87,11 @@ class MetalVideoView: UIView { super.awakeFromNib() } - func setPlaceholder(text:String) { + func setPlaceholder(text: String) { placeholder.text = text } - func setInfo(text:String) { + func setInfo(text: String) { infolabel.text = text } } @@ -101,11 +105,11 @@ class SampleBufferDisplayView: UIView { super.awakeFromNib() } - func setPlaceholder(text:String) { + func setPlaceholder(text: String) { placeholder.text = text } - func setInfo(text:String) { + func setInfo(text: String) { infolabel.text = text } } diff --git a/iOS/APIExample/APIExample/Examples/Advanced/ARKit/ARKit.swift b/iOS/APIExample/APIExample/Examples/Advanced/ARKit/ARKit.swift index 778814d5a..d1ae730ae 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/ARKit/ARKit.swift +++ b/iOS/APIExample/APIExample/Examples/Advanced/ARKit/ARKit.swift @@ -9,8 +9,7 @@ import UIKit import AgoraRtcKit import ARKit -class ARKitEntry : 
UIViewController -{ +class ARKitEntry: UIViewController { @IBOutlet weak var joinButton: AGButton! @IBOutlet weak var channelTextField: AGTextField! @IBOutlet weak var firstFrameSwitch: UISwitch! @@ -23,11 +22,16 @@ class ARKitEntry : UIViewController @IBAction func doOptimizeFirstFrameSwitch(_ sender: UISwitch) { if sender.isOn { - let alertVC = UIAlertController(title: "After this function is enabled, it cannot be disabled and takes effect only when both the primary and secondary ends are enabled".localized, + // swiftlint:disable line_length + let title = "After this function is enabled, it cannot be disabled and takes effect only when both the primary and secondary ends are enabled".localized + // swiftlint:enable line_length + let alertVC = UIAlertController(title: title, message: nil, preferredStyle: .alert) - let ok = UIAlertAction(title: "Sure".localized, style: .default, handler: nil) + let ok = UIAlertAction(title: "Sure".localized, + style: .default, + handler: nil) let cancel = UIAlertAction(title: "Cancel".localized, style: .cancel) { _ in sender.isOn = false } @@ -39,14 +43,14 @@ class ARKitEntry : UIViewController @IBAction func doJoinPressed(sender: AGButton) { guard let channelName = channelTextField.text else {return} - //resign channel text field + // resign channel text field channelTextField.resignFirstResponder() let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) // create new view controller every time to ensure we get a clean vc guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? 
BaseViewController else {return} newViewController.title = channelName - newViewController.configs = ["channelName":channelName, "isFirstFrame": firstFrameSwitch.isOn] + newViewController.configs = ["channelName": channelName, "isFirstFrame": firstFrameSwitch.isOn] navigationController?.pushViewController(newViewController, animated: true) } } @@ -66,7 +70,7 @@ class ARKitMain: BaseViewController { var isJoined: Bool = false var planarDetected: Bool = false { didSet { - if(planarDetected) { + if planarDetected { infoLabel.text = "Tap to place remote video canvas".localized } else { infoLabel.text = "Move Camera to find a planar\n(Shown as Red Rectangle)".localized @@ -77,7 +81,7 @@ class ARKitMain: BaseViewController { override func viewDidLoad() { super.viewDidLoad() - //set AR Scene delegate + // set AR Scene delegate sceneView.delegate = self sceneView.session.delegate = self sceneView.showsStatistics = true @@ -111,7 +115,6 @@ class ARKitMain: BaseViewController { // start AR Session startARSession() - // Set audio route to speaker agoraKit.setDefaultAudioRouteToSpeakerphone(true) @@ -127,7 +130,7 @@ class ARKitMain: BaseViewController { // Usually happens with invalid parameters // Error code description can be found at: // en: https://docs.agora.io/en/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } } @@ -136,8 +139,9 @@ class ARKitMain: BaseViewController { // start AR World tracking func startARSession() { guard ARWorldTrackingConfiguration.isSupported else { + let message = "This app requires world tracking, which is available only on iOS devices with the A9 processor or later.".localized showAlert(title: "ARKit is not available on this device.".localized, - message: "This app requires world 
tracking, which is available only on iOS devices with the A9 processor or later.".localized) + message: message) return } @@ -156,7 +160,7 @@ class ARKitMain: BaseViewController { } @IBAction func doSceneViewTapped(_ recognizer: UITapGestureRecognizer) { - if(!planarDetected) { + if !planarDetected { LogUtils.log(message: "Planar not yet found", level: .warning) return } @@ -188,23 +192,21 @@ private extension ARKitMain { renderer.renderNode = node activeScreens[uid] = node } - func addNode(withTransform transform: matrix_float4x4) { - let scene = SCNScene(named: "AR.scnassets/displayer.scn")! + guard let scene = SCNScene(named: "AR.scnassets/displayer.scn") else { return } let rootNode = scene.rootNode - rootNode.position = SCNVector3( transform.columns.3.x, transform.columns.3.y, transform.columns.3.z ) - rootNode.rotation = SCNVector4(0, 1, 0, sceneView.session.currentFrame!.camera.eulerAngles.y) - + guard let currentFrame = sceneView.session.currentFrame else { return } + rootNode.rotation = SCNVector4(0, 1, 0, currentFrame.camera.eulerAngles.y) sceneView.scene.rootNode.addChildNode(rootNode) - - let displayer = rootNode.childNode(withName: "displayer", recursively: false)! - let screen = displayer.childNode(withName: "screen", recursively: false)! 
- + let displayer = rootNode.childNode(withName: "displayer", + recursively: false) + guard let screen = displayer?.childNode(withName: "screen", + recursively: false) else { return } if let undisplayedUid = undisplayedUsers.first { undisplayedUsers.removeFirst() renderRemoteUser(uid: undisplayedUid, toNode: screen) @@ -212,28 +214,23 @@ private extension ARKitMain { unusedScreenNodes.append(screen) } } - func removeNode(_ node: SCNNode) { let rootNode: SCNNode let screen: SCNNode - if node.name == "screen", let parent = node.parent?.parent { rootNode = parent screen = node } else if node.name == "displayer", let parent = node.parent { rootNode = parent - screen = parent.childNode(withName: "screen", recursively: false)! + screen = parent.childNode(withName: "screen", recursively: false) ?? SCNNode() } else { rootNode = node screen = node } - rootNode.removeFromParentNode() - if let index = unusedScreenNodes.firstIndex(where: {$0 == screen}) { unusedScreenNodes.remove(at: index) } - if let (uid, _) = activeScreens.first(where: {$1 == screen}) { activeScreens.removeValue(forKey: uid) if let screenNode = unusedScreenNodes.first { @@ -262,7 +259,7 @@ extension ARKitMain: AgoraRtcEngineDelegate { /// to let user know something wrong is happening /// Error code description can be found at: /// en: https://docs.agora.io/en/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode.rawValue)", level: .error) @@ -308,13 +305,15 @@ extension ARKitMain: AgoraRtcEngineDelegate { } } - func rtcEngine(_ engine: AgoraRtcEngineKit, videoRenderingTracingResultOfUid uid: UInt, currentEvent: AgoraMediaTraceEvent, tracingInfo: AgoraVideoRenderingTracingInfo) { + 
func rtcEngine(_ engine: AgoraRtcEngineKit, + videoRenderingTracingResultOfUid uid: UInt, + currentEvent: AgoraMediaTraceEvent, + tracingInfo: AgoraVideoRenderingTracingInfo) { statsLabel.isHidden = tracingInfo.elapsedTime <= 0 statsLabel.text = "firstFrameTime: \(tracingInfo.elapsedTime)" } } - extension ARKitMain: ARSCNViewDelegate { func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) { guard let planeAnchor = anchor as? ARPlaneAnchor else { @@ -331,8 +330,8 @@ extension ARKitMain: ARSCNViewDelegate { node.addChildNode(planeNode) planeNode.runAction(SCNAction.fadeOut(duration: 3)) - //found planar - if(!planarDetected) { + // found planar + if !planarDetected { DispatchQueue.main.async {[weak self] in guard let weakSelf = self else { return diff --git a/iOS/APIExample/APIExample/Examples/Advanced/AudioMixing/AudioMixing.swift b/iOS/APIExample/APIExample/Examples/Advanced/AudioMixing/AudioMixing.swift index de67e3efd..4c1ef8e92 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/AudioMixing/AudioMixing.swift +++ b/iOS/APIExample/APIExample/Examples/Advanced/AudioMixing/AudioMixing.swift @@ -10,16 +10,15 @@ import UIKit import AgoraRtcKit import AGEVideoLayout -let EFFECT_ID:Int32 = 1 +let EFFECT_ID: Int32 = 1 -class AudioMixingEntry : UIViewController -{ +class AudioMixingEntry: UIViewController { @IBOutlet weak var joinButton: AGButton! @IBOutlet weak var channelTextField: AGTextField! @IBOutlet weak var scenarioBtn: UIButton! @IBOutlet weak var profileBtn: UIButton! 
- var profile:AgoraAudioProfile = .default - var scenario:AgoraAudioScenario = .default + var profile: AgoraAudioProfile = .default + var scenario: AgoraAudioScenario = .default let identifier = "AudioMixing" override func viewDidLoad() { @@ -30,44 +29,58 @@ class AudioMixingEntry : UIViewController } @IBAction func doJoinPressed(sender: AGButton) { - guard let channelName = channelTextField.text else {return} - //resign channel text field + guard let channelName = channelTextField.text else { return } + // resign channel text field channelTextField.resignFirstResponder() let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) // create new view controller every time to ensure we get a clean vc - guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else {return} + guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else { + return + } newViewController.title = channelName - newViewController.configs = ["channelName":channelName, "audioProfile":profile, "audioScenario":scenario] + newViewController.configs = ["channelName": channelName, + "audioProfile": profile, + "audioScenario": scenario] navigationController?.pushViewController(newViewController, animated: true) } - func getAudioProfileAction(_ profile:AgoraAudioProfile) -> UIAlertAction{ - return UIAlertAction(title: "\(profile.description())", style: .default, handler: {[unowned self] action in + func getAudioProfileAction(_ profile: AgoraAudioProfile) -> UIAlertAction { + return UIAlertAction(title: "\(profile.description())", + style: .default, + handler: { [unowned self] _ in self.profile = profile self.profileBtn.setTitle("\(profile.description())", for: .normal) }) } - func getAudioScenarioAction(_ scenario:AgoraAudioScenario) -> UIAlertAction{ - return UIAlertAction(title: "\(scenario.description())", style: .default, handler: {[unowned self] action in + 
func getAudioScenarioAction(_ scenario: AgoraAudioScenario) -> UIAlertAction { + return UIAlertAction(title: "\(scenario.description())", + style: .default, + handler: { [unowned self] _ in self.scenario = scenario self.scenarioBtn.setTitle("\(scenario.description())", for: .normal) }) } - @IBAction func setAudioProfile(){ - let alert = UIAlertController(title: "Set Audio Profile".localized, message: nil, preferredStyle: UIDevice.current.userInterfaceIdiom == .pad ? UIAlertController.Style.alert : UIAlertController.Style.actionSheet) - for profile in AgoraAudioProfile.allValues(){ + @IBAction func setAudioProfile() { + let style: UIAlertController.Style = UIDevice.current.userInterfaceIdiom == .pad ? .alert : .actionSheet + let alert = UIAlertController(title: "Set Audio Profile".localized, + message: nil, + preferredStyle: style) + for profile in AgoraAudioProfile.allValues() { alert.addAction(getAudioProfileAction(profile)) } alert.addCancelAction() present(alert, animated: true, completion: nil) } - @IBAction func setAudioScenario(){ - let alert = UIAlertController(title: "Set Audio Scenario".localized, message: nil, preferredStyle: UIDevice.current.userInterfaceIdiom == .pad ? UIAlertController.Style.alert : UIAlertController.Style.actionSheet) - for scenario in AgoraAudioScenario.allValues(){ + @IBAction func setAudioScenario() { + let style: UIAlertController.Style = UIDevice.current.userInterfaceIdiom == .pad ? .alert : .actionSheet + let alert = UIAlertController(title: "Set Audio Scenario".localized, + message: nil, + preferredStyle: style) + for scenario in AgoraAudioScenario.allValues() { alert.addAction(getAudioScenarioAction(scenario)) } alert.addCancelAction() @@ -84,19 +97,21 @@ class AudioMixingMain: BaseViewController { @IBOutlet weak var audioMixingProgressView: UIProgressView! @IBOutlet weak var audioMixingDuration: UILabel! @IBOutlet weak var audioEffectVolumeSlider: UISlider! - var audioViews: [UInt:VideoView] = [:] - var timer:Timer? 
+ var audioViews: [UInt: VideoView] = [:] + var timer: Timer? // indicate if current instance has joined channel var isJoined: Bool = false - override func viewDidLoad(){ + override func viewDidLoad() { super.viewDidLoad() guard let channelName = configs["channelName"] as? String, let audioProfile = configs["audioProfile"] as? AgoraAudioProfile, let audioScenario = configs["audioScenario"] as? AgoraAudioScenario - else {return} + else { + return + } // set up agora instance when view loaded let config = AgoraRtcEngineConfig() @@ -131,7 +146,6 @@ class AudioMixingMain: BaseViewController { // enable volume indicator agoraKit.enableAudioVolumeIndication(200, smooth: 3, reportVad: false) - // start joining channel // 1. Users can only see each other after they join the // same channel successfully using the same app id. @@ -139,11 +153,14 @@ class AudioMixingMain: BaseViewController { // when joining channel. The channel name and uid used to calculate // the token has to match the ones used for channel join NetworkManager.shared.generateToken(channelName: channelName, success: { token in - let result = self.agoraKit.joinChannel(byToken: token, channelId: channelName, info: nil, uid: 0) {[unowned self] (channel, uid, elapsed) -> Void in + let result = self.agoraKit.joinChannel(byToken: token, + channelId: channelName, + info: nil, + uid: 0) {[unowned self] (channel, uid, elapsed) -> Void in self.isJoined = true LogUtils.log(message: "Join \(channel) with uid \(uid) elapsed \(elapsed)ms", level: .info) - //set up local audio view, this view will not show video but just a placeholder + // set up local audio view, this view will not show video but just a placeholder let view = Bundle.loadVideoView(type: .local, audioOnly: true) self.audioViews[0] = view view.setPlaceholder(text: self.getAudioLabel(uid: uid, isLocal: true)) @@ -152,8 +169,8 @@ class AudioMixingMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description 
can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -176,31 +193,31 @@ class AudioMixingMain: BaseViewController { return Array(audioViews.values).sorted(by: { $0.uid < $1.uid }) } - @IBAction func onChangeAudioMixingVolume(_ sender:UISlider){ - let value:Int = Int(sender.value) + @IBAction func onChangeAudioMixingVolume(_ sender: UISlider) { + let value = Int(sender.value) print("adjustAudioMixingVolume \(value)") agoraKit.adjustAudioMixingVolume(value) } - @IBAction func onChangeAudioMixingPlaybackVolume(_ sender:UISlider){ - let value:Int = Int(sender.value) + @IBAction func onChangeAudioMixingPlaybackVolume(_ sender: UISlider) { + let value = Int(sender.value) print("adjustAudioMixingPlayoutVolume \(value)") agoraKit.adjustAudioMixingPlayoutVolume(value) } - @IBAction func onChangeAudioMixingPublishVolume(_ sender:UISlider){ - let value:Int = Int(sender.value) + @IBAction func onChangeAudioMixingPublishVolume(_ sender: UISlider) { + let value = Int(sender.value) print("adjustAudioMixingPublishVolume \(value)") agoraKit.adjustAudioMixingPublishVolume(value) } - @IBAction func onChangeAudioEffectVolume(_ sender:UISlider){ - let value:Int = Int(sender.value) + @IBAction func onChangeAudioEffectVolume(_ sender: UISlider) { + let value = Int(sender.value) print("setEffectsVolume \(value)") agoraKit.setEffectsVolume(value) } - @IBAction func onStartAudioMixing(_ sender:UIButton){ + @IBAction func onStartAudioMixing(_ sender: UIButton) { if let filepath = Bundle.main.path(forResource: "audiomixing", ofType: "mp3") { let result = 
agoraKit.startAudioMixing(filepath, loopback: false, cycle: -1) if result != 0 { @@ -209,7 +226,7 @@ class AudioMixingMain: BaseViewController { } } - @IBAction func onStopAudioMixing(_ sender:UIButton){ + @IBAction func onStopAudioMixing(_ sender: UIButton) { let result = agoraKit.stopAudioMixing() if result != 0 { self.showAlert(title: "Error", message: "stopAudioMixing call failed: \(result), please check your params") @@ -219,7 +236,7 @@ class AudioMixingMain: BaseViewController { } } - @IBAction func onPauseAudioMixing(_ sender:UIButton){ + @IBAction func onPauseAudioMixing(_ sender: UIButton) { let result = agoraKit.pauseAudioMixing() if result != 0 { self.showAlert(title: "Error", message: "pauseAudioMixing call failed: \(result), please check your params") @@ -228,7 +245,7 @@ class AudioMixingMain: BaseViewController { } } - @IBAction func onResumeAudioMixing(_ sender:UIButton){ + @IBAction func onResumeAudioMixing(_ sender: UIButton) { let result = agoraKit.resumeAudioMixing() if result != 0 { self.showAlert(title: "Error", message: "resumeAudioMixing call failed: \(result), please check your params") @@ -239,8 +256,8 @@ class AudioMixingMain: BaseViewController { func startProgressTimer() { // begin timer to update progress - if(timer == nil) { - timer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true, block: { [weak self](timer:Timer) in + if timer == nil { + timer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true, block: { [weak self] _ in guard let weakself = self else {return} let progress = Float(weakself.agoraKit.getAudioMixingCurrentPosition()) / Float(weakself.agoraKit.getAudioMixingDuration()) weakself.audioMixingProgressView.setProgress(progress, animated: true) @@ -250,14 +267,14 @@ class AudioMixingMain: BaseViewController { func stopProgressTimer() { // stop timer - if(timer != nil) { + if timer != nil { timer?.invalidate() timer = nil } } - func updateTotalDuration(reset:Bool) { - if(reset) { + func 
updateTotalDuration(reset: Bool) { + if reset { audioMixingDuration.text = "00 : 00" } else { let duration = agoraKit.getAudioMixingDuration() @@ -266,30 +283,36 @@ class AudioMixingMain: BaseViewController { } } - @IBAction func onPlayEffect(_ sender:UIButton){ + @IBAction func onPlayEffect(_ sender: UIButton) { if let filepath = Bundle.main.path(forResource: "audioeffect", ofType: "mp3") { - let result = agoraKit.playEffect(EFFECT_ID, filePath: filepath, loopCount: -1, pitch: 1, pan: 0, gain: 100, publish: true) + let result = agoraKit.playEffect(EFFECT_ID, + filePath: + filepath, + loopCount: -1, + pitch: 1, pan: 0, + gain: 100, + publish: true) if result != 0 { self.showAlert(title: "Error", message: "playEffect call failed: \(result), please check your params") } } } - @IBAction func onStopEffect(_ sender:UIButton){ + @IBAction func onStopEffect(_ sender: UIButton) { let result = agoraKit.stopEffect(EFFECT_ID) if result != 0 { self.showAlert(title: "Error", message: "stopEffect call failed: \(result), please check your params") } } - @IBAction func onPauseEffect(_ sender:UIButton){ + @IBAction func onPauseEffect(_ sender: UIButton) { let result = agoraKit.pauseEffect(EFFECT_ID) if result != 0 { self.showAlert(title: "Error", message: "pauseEffect call failed: \(result), please check your params") } } - @IBAction func onResumeEffect(_ sender:UIButton){ + @IBAction func onResumeEffect(_ sender: UIButton) { let result = agoraKit.resumeEffect(EFFECT_ID) if result != 0 { self.showAlert(title: "Error", message: "resumeEffect call failed: \(result), please check your params") @@ -312,8 +335,8 @@ extension AudioMixingMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: 
https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) @@ -331,7 +354,7 @@ extension AudioMixingMain: AgoraRtcEngineDelegate { func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) { LogUtils.log(message: "remote user join: \(uid) \(elapsed)ms", level: .info) - //set up remote audio view, this view will not show video but just a placeholder + // set up remote audio view, this view will not show video but just a placeholder let view = Bundle.loadVideoView(type: .remote, audioOnly: true) view.uid = uid self.audioViews[uid] = view @@ -347,7 +370,7 @@ extension AudioMixingMain: AgoraRtcEngineDelegate { func rtcEngine(_ engine: AgoraRtcEngineKit, didOfflineOfUid uid: UInt, reason: AgoraUserOfflineReason) { LogUtils.log(message: "remote user left: \(uid) reason \(reason)", level: .info) - //remove remote audio view + // remove remote audio view self.audioViews.removeValue(forKey: uid) self.container.layoutStream2x1(views: sortedViews()) self.container.reload(level: 0, animated: true) @@ -357,11 +380,6 @@ extension AudioMixingMain: AgoraRtcEngineDelegate { /// @params speakers volume info for all speakers /// @params totalVolume Total volume after audio mixing. The value range is [0,255]. func rtcEngine(_ engine: AgoraRtcEngineKit, reportAudioVolumeIndicationOfSpeakers speakers: [AgoraRtcAudioVolumeInfo], totalVolume: Int) { -// for volumeInfo in speakers { -// if let audioView = audioViews[volumeInfo.uid] { -// audioView.setInfo(text: "Volume:\(volumeInfo.volume)") -// } -// } } /// Reports the statistics of the current call. 
The SDK triggers this callback once every two seconds after the user joins the channel. @@ -381,7 +399,9 @@ extension AudioMixingMain: AgoraRtcEngineDelegate { func rtcEngine(_ engine: AgoraRtcEngineKit, remoteAudioStats stats: AgoraRtcRemoteAudioStats) { audioViews[stats.uid]?.statsInfo?.updateAudioStats(stats) } - func rtcEngine(_ engine: AgoraRtcEngineKit, audioMixingStateChanged state: AgoraAudioMixingStateType, reasonCode: AgoraAudioMixingReasonCode) { + func rtcEngine(_ engine: AgoraRtcEngineKit, + audioMixingStateChanged state: AgoraAudioMixingStateType, + reasonCode: AgoraAudioMixingReasonCode) { LogUtils.log(message: "audioMixingStateChanged \(state.rawValue), code: \(reasonCode.rawValue)", level: .info) if state == .playing { startProgressTimer() diff --git a/iOS/APIExample/APIExample/Examples/Advanced/AudioWaveform/AudioWaveform.swift b/iOS/APIExample/APIExample/Examples/Advanced/AudioWaveform/AudioWaveform.swift index ec82a48d6..64c89c0cc 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/AudioWaveform/AudioWaveform.swift +++ b/iOS/APIExample/APIExample/Examples/Advanced/AudioWaveform/AudioWaveform.swift @@ -10,14 +10,13 @@ import UIKit import AgoraRtcKit import AGEVideoLayout -class AudioWaveformEntry : UIViewController -{ +class AudioWaveformEntry: UIViewController { @IBOutlet weak var joinButton: AGButton! @IBOutlet weak var channelTextField: AGTextField! @IBOutlet weak var scenarioBtn: UIButton! @IBOutlet weak var profileBtn: UIButton! 
- var profile:AgoraAudioProfile = .default - var scenario:AgoraAudioScenario = .default + var profile: AgoraAudioProfile = .default + var scenario: AgoraAudioScenario = .default let identifier = "AudioWaveform" override func viewDidLoad() { @@ -29,43 +28,53 @@ class AudioWaveformEntry : UIViewController @IBAction func doJoinPressed(sender: AGButton) { guard let channelName = channelTextField.text else {return} - //resign channel text field + // resign channel text field channelTextField.resignFirstResponder() let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) // create new view controller every time to ensure we get a clean vc - guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else {return} + guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else { + return + } newViewController.title = channelName - newViewController.configs = ["channelName":channelName, "audioProfile":profile, "audioScenario":scenario] + newViewController.configs = ["channelName": channelName, + "audioProfile": profile, + "audioScenario": scenario] navigationController?.pushViewController(newViewController, animated: true) } - func getAudioProfileAction(_ profile:AgoraAudioProfile) -> UIAlertAction{ - return UIAlertAction(title: "\(profile.description())", style: .default, handler: {[unowned self] action in + func getAudioProfileAction(_ profile: AgoraAudioProfile) -> UIAlertAction { + return UIAlertAction(title: "\(profile.description())", + style: .default, + handler: { [unowned self] _ in self.profile = profile self.profileBtn.setTitle("\(profile.description())", for: .normal) }) } - func getAudioScenarioAction(_ scenario:AgoraAudioScenario) -> UIAlertAction{ - return UIAlertAction(title: "\(scenario.description())", style: .default, handler: {[unowned self] action in + func getAudioScenarioAction(_ scenario: AgoraAudioScenario) -> 
UIAlertAction { + return UIAlertAction(title: "\(scenario.description())", + style: .default, + handler: { [unowned self] _ in self.scenario = scenario self.scenarioBtn.setTitle("\(scenario.description())", for: .normal) }) } - @IBAction func setAudioProfile(){ - let alert = UIAlertController(title: "Set Audio Profile".localized, message: nil, preferredStyle: UIDevice.current.userInterfaceIdiom == .pad ? UIAlertController.Style.alert : UIAlertController.Style.actionSheet) - for profile in AgoraAudioProfile.allValues(){ + @IBAction func setAudioProfile() { + let style: UIAlertController.Style = UIDevice.current.userInterfaceIdiom == .pad ? .alert : .actionSheet + let alert = UIAlertController(title: "Set Audio Profile".localized, message: nil, preferredStyle: style) + for profile in AgoraAudioProfile.allValues() { alert.addAction(getAudioProfileAction(profile)) } alert.addCancelAction() present(alert, animated: true, completion: nil) } - @IBAction func setAudioScenario(){ - let alert = UIAlertController(title: "Set Audio Scenario".localized, message: nil, preferredStyle: UIDevice.current.userInterfaceIdiom == .pad ? UIAlertController.Style.alert : UIAlertController.Style.actionSheet) - for scenario in AgoraAudioScenario.allValues(){ + @IBAction func setAudioScenario() { + let style: UIAlertController.Style = UIDevice.current.userInterfaceIdiom == .pad ? .alert : .actionSheet + let alert = UIAlertController(title: "Set Audio Scenario".localized, message: nil, preferredStyle: style) + for scenario in AgoraAudioScenario.allValues() { alert.addAction(getAudioScenarioAction(scenario)) } alert.addCancelAction() @@ -78,12 +87,12 @@ class AudioWaveformMain: BaseViewController { @IBOutlet weak var container: AGEVideoContainer! @IBOutlet weak var boxingview: ZSNBoxingView! 
- var audioViews: [UInt:VideoView] = [:] + var audioViews: [UInt: VideoView] = [:] // indicate if current instance has joined channel var isJoined: Bool = false - override func viewDidLoad(){ + override func viewDidLoad() { super.viewDidLoad() guard let channelName = configs["channelName"] as? String, @@ -114,7 +123,8 @@ class AudioWaveformMain: BaseViewController { let resolution = (GlobalSettings.shared.getSetting(key: "resolution")?.selectedOption().value as? CGSize) ?? .zero let fps = (GlobalSettings.shared.getSetting(key: "fps")?.selectedOption().value as? AgoraVideoFrameRate) ?? .fps15 - let orientation = (GlobalSettings.shared.getSetting(key: "orientation")?.selectedOption().value as? AgoraVideoOutputOrientationMode) ?? .fixedPortrait + let orientation = (GlobalSettings.shared.getSetting(key: "orientation")? + .selectedOption().value as? AgoraVideoOutputOrientationMode) ?? .fixedPortrait agoraKit.setVideoEncoderConfiguration(AgoraVideoEncoderConfiguration(size: resolution, frameRate: fps, bitrate: AgoraVideoBitrateStandard, @@ -146,8 +156,8 @@ class AudioWaveformMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -194,8 +204,8 @@ extension AudioWaveformMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: 
https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) @@ -206,7 +216,7 @@ extension AudioWaveformMain: AgoraRtcEngineDelegate { self.isJoined = true LogUtils.log(message: "Join \(channel) with uid \(uid) elapsed \(elapsed)ms", level: .info) - //set up local audio view, this view will not show video but just a placeholder + // set up local audio view, this view will not show video but just a placeholder let view = Bundle.loadVideoView(type: .local, audioOnly: true) self.audioViews[0] = view view.setPlaceholder(text: self.getAudioLabel(uid: uid, isLocal: true)) @@ -219,7 +229,7 @@ extension AudioWaveformMain: AgoraRtcEngineDelegate { func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) { LogUtils.log(message: "remote user join: \(uid) \(elapsed)ms", level: .info) - //set up remote audio view, this view will not show video but just a placeholder + // set up remote audio view, this view will not show video but just a placeholder let view = Bundle.loadVideoView(type: .remote, audioOnly: true) view.uid = uid self.audioViews[uid] = view @@ -235,7 +245,7 @@ extension AudioWaveformMain: AgoraRtcEngineDelegate { func rtcEngine(_ engine: AgoraRtcEngineKit, didOfflineOfUid uid: UInt, reason: AgoraUserOfflineReason) { LogUtils.log(message: "remote user left: \(uid) reason \(reason)", level: .info) - //remove remote audio view + // remove remote audio view self.audioViews.removeValue(forKey: uid) self.container.layoutStream3x2(views: sortedViews()) self.container.reload(level: 0, 
animated: true) diff --git a/iOS/APIExample/APIExample/Examples/Advanced/AuidoRouterPlayer/AuidoRouterPlayer.swift b/iOS/APIExample/APIExample/Examples/Advanced/AuidoRouterPlayer/AuidoRouterPlayer.swift index 7ebce97d4..bcb439b8e 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/AuidoRouterPlayer/AuidoRouterPlayer.swift +++ b/iOS/APIExample/APIExample/Examples/Advanced/AuidoRouterPlayer/AuidoRouterPlayer.swift @@ -15,8 +15,7 @@ enum ThirdPlayerType: String { case origin = "avplayer" } -class AuidoRouterPlayerEntry : UIViewController -{ +class AuidoRouterPlayerEntry: UIViewController { @IBOutlet weak var joinButton: UIButton! @IBOutlet weak var channelTextField: UITextField! let identifier = "AuidoRouterPlayer" @@ -24,53 +23,54 @@ class AuidoRouterPlayerEntry : UIViewController @IBOutlet var fpsBtn: UIButton! @IBOutlet var orientationBtn: UIButton! @IBOutlet weak var chosePlayerButton: UIButton! - var width:Int = 960, height:Int = 540, orientation:AgoraVideoOutputOrientationMode = .adaptative, fps = 15 + var width: Int = 960, height: Int = 540, orientation: AgoraVideoOutputOrientationMode = .adaptative, fps = 15 private var playerType: ThirdPlayerType = .ijk override func viewDidLoad() { super.viewDidLoad() } @IBAction func onChosePlayerType(_ sender: UIButton) { + let style: UIAlertController.Style = UIDevice.current.userInterfaceIdiom == .pad ? .alert : .actionSheet let alert = UIAlertController(title: "Player Type(ijkplayer/avplayer)".localized, message: nil, - preferredStyle: UIDevice.current.userInterfaceIdiom == .pad ?
UIAlertController.Style.alert : UIAlertController.Style.actionSheet) + preferredStyle: style) alert.addAction(getPlayerAction(ThirdPlayerType.ijk.rawValue)) alert.addAction(getPlayerAction(ThirdPlayerType.origin.rawValue)) alert.addCancelAction() present(alert, animated: true, completion: nil) } - func getPlayerAction(_ title: String) -> UIAlertAction{ - return UIAlertAction(title: title, style: .default, handler: {[unowned self] action in + func getPlayerAction(_ title: String) -> UIAlertAction { + return UIAlertAction(title: title, style: .default, handler: { [unowned self] _ in self.chosePlayerButton.setTitle(title, for: .normal) self.playerType = ThirdPlayerType(rawValue: title) ?? .ijk }) } - func getResolutionAction(width:Int, height:Int) -> UIAlertAction{ - return UIAlertAction(title: "\(width)x\(height)", style: .default, handler: {[unowned self] action in + func getResolutionAction(width: Int, height: Int) -> UIAlertAction { + return UIAlertAction(title: "\(width)x\(height)", style: .default, handler: { [unowned self] _ in self.width = width self.height = height self.resolutionBtn.setTitle("\(width)x\(height)", for: .normal) }) } - func getFpsAction(_ fps:Int) -> UIAlertAction{ - return UIAlertAction(title: "\(fps)fps", style: .default, handler: {[unowned self] action in + func getFpsAction(_ fps: Int) -> UIAlertAction { + return UIAlertAction(title: "\(fps)fps", style: .default, handler: { [unowned self] _ in self.fps = fps self.fpsBtn.setTitle("\(fps)fps", for: .normal) }) } - func getOrientationAction(_ orientation:AgoraVideoOutputOrientationMode) -> UIAlertAction{ - return UIAlertAction(title: "\(orientation.description())", style: .default, handler: {[unowned self] action in + func getOrientationAction(_ orientation: AgoraVideoOutputOrientationMode) -> UIAlertAction { + return UIAlertAction(title: "\(orientation.description())", style: .default, handler: { [unowned self] _ in self.orientation = orientation 
self.orientationBtn.setTitle("\(orientation.description())", for: .normal) }) } - @IBAction func setResolution(){ - let alert = UIAlertController(title: "Set Resolution".localized, message: nil, preferredStyle: UIDevice.current.userInterfaceIdiom == .pad ? UIAlertController.Style.alert : UIAlertController.Style.actionSheet) + @IBAction func setResolution() { + let style: UIAlertController.Style = UIDevice.current.userInterfaceIdiom == .pad ? .alert : .actionSheet + let alert = UIAlertController(title: "Set Resolution".localized, message: nil, preferredStyle: style) alert.addAction(getResolutionAction(width: 90, height: 90)) alert.addAction(getResolutionAction(width: 160, height: 120)) alert.addAction(getResolutionAction(width: 320, height: 240)) @@ -80,8 +80,9 @@ class AuidoRouterPlayerEntry : UIViewController present(alert, animated: true, completion: nil) } - @IBAction func setFps(){ - let alert = UIAlertController(title: "Set Fps".localized, message: nil, preferredStyle: UIDevice.current.userInterfaceIdiom == .pad ? UIAlertController.Style.alert : UIAlertController.Style.actionSheet) + @IBAction func setFps() { + let style: UIAlertController.Style = UIDevice.current.userInterfaceIdiom == .pad ? .alert : .actionSheet + let alert = UIAlertController(title: "Set Fps".localized, message: nil, preferredStyle: style) alert.addAction(getFpsAction(10)) alert.addAction(getFpsAction(15)) alert.addAction(getFpsAction(24)) @@ -91,8 +92,9 @@ class AuidoRouterPlayerEntry : UIViewController present(alert, animated: true, completion: nil) } - @IBAction func setOrientation(){ - let alert = UIAlertController(title: "Set Orientation".localized, message: nil, preferredStyle: UIDevice.current.userInterfaceIdiom == .pad ? UIAlertController.Style.alert : UIAlertController.Style.actionSheet) + @IBAction func setOrientation() { + let style: UIAlertController.Style = UIDevice.current.userInterfaceIdiom == .pad ? 
.alert : .actionSheet + let alert = UIAlertController(title: "Set Orientation".localized, message: nil, preferredStyle: style) alert.addAction(getOrientationAction(.adaptative)) alert.addAction(getOrientationAction(.fixedLandscape)) alert.addAction(getOrientationAction(.fixedPortrait)) @@ -101,16 +103,17 @@ class AuidoRouterPlayerEntry : UIViewController } @IBAction func doJoinPressed(sender: UIButton) { - guard let channelName = channelTextField.text else {return} - //resign channel text field + guard let channelName = channelTextField.text else { return } + // resign channel text field channelTextField.resignFirstResponder() let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) // create new view controller every time to ensure we get a clean vc - guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else {return} + guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? 
BaseViewController else { return + } newViewController.title = channelName - newViewController.configs = ["channelName":channelName, - "resolution":CGSize(width: width, height: height), + newViewController.configs = ["channelName": channelName, + "resolution": CGSize(width: width, height: height), "fps": fps, "orientation": orientation, "playerType": playerType.rawValue] @@ -219,8 +222,8 @@ class AuidoRouterPlayerMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -228,7 +231,7 @@ class AuidoRouterPlayerMain: BaseViewController { override func viewDidAppear(_ animated: Bool) { super.viewDidAppear(animated) - let playerType = ThirdPlayerType(rawValue: configs["playerType"] as! String) + let playerType = ThirdPlayerType(rawValue: configs["playerType"] as? String ?? "") if playerType == .ijk { setupIJKPlayer() } else { @@ -261,7 +264,7 @@ class AuidoRouterPlayerMain: BaseViewController { } } AgoraRtcEngineKit.destroy() - let playerType = ThirdPlayerType(rawValue: configs["playerType"] as! String) + let playerType = ThirdPlayerType(rawValue: configs["playerType"] as? String ?? 
"") if playerType == .origin { avPlayer?.player?.pause() } else { @@ -271,9 +274,10 @@ class AuidoRouterPlayerMain: BaseViewController { } extension AuidoRouterPlayerMain: AVPlayerViewControllerDelegate { - func playerViewController(_ playerViewController: AVPlayerViewController, willEndFullScreenPresentationWithAnimationCoordinator coordinator: UIViewControllerTransitionCoordinator) { + func playerViewController(_ playerViewController: AVPlayerViewController, + willEndFullScreenPresentationWithAnimationCoordinator coordinator: UIViewControllerTransitionCoordinator) { // The system pauses when returning from full screen, we need to 'resume' manually. - coordinator.animate(alongsideTransition: nil) { transitionContext in + coordinator.animate(alongsideTransition: nil) { _ in playerViewController.player?.play() } } @@ -294,8 +298,8 @@ extension AuidoRouterPlayerMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) @@ -307,7 +311,8 @@ extension AuidoRouterPlayerMain: AgoraRtcEngineDelegate { LogUtils.log(message: "Join \(channel) with uid \(uid) elapsed \(elapsed)ms", level: .info) } - /// callback when a remote user is joinning the channel, note audience in live broadcast mode will NOT trigger this event + /// callback when a remote user is joining the
channel, + /// note audience in live broadcast mode will NOT trigger this event /// @param uid uid of remote joined user /// @param elapsed time elapse since current sdk instance join the channel in ms func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) { @@ -324,7 +329,8 @@ extension AuidoRouterPlayerMain: AgoraRtcEngineDelegate { agoraKit.setupRemoteVideo(videoCanvas) } - /// callback when a remote user is leaving the channel, note audience in live broadcast mode will NOT trigger this event + /// callback when a remote user is leaving the channel, + /// note audience in live broadcast mode will NOT trigger this event /// @param uid uid of remote joined user /// @param reason reason why this user left, note this event may be triggered when the remote user /// become an audience in live broadcasting profile @@ -342,7 +348,8 @@ extension AuidoRouterPlayerMain: AgoraRtcEngineDelegate { agoraKit.setupRemoteVideo(videoCanvas) } - /// Reports the statistics of the current call. The SDK triggers this callback once every two seconds after the user joins the channel. + /// Reports the statistics of the current call. + /// The SDK triggers this callback once every two seconds after the user joins the channel. /// @param stats stats struct func rtcEngine(_ engine: AgoraRtcEngineKit, reportRtcStats stats: AgoraChannelStats) { localVideo.statsInfo?.updateChannelStats(stats) diff --git a/iOS/APIExample/APIExample/Examples/Advanced/ContentInspect/ContentInspect.swift b/iOS/APIExample/APIExample/Examples/Advanced/ContentInspect/ContentInspect.swift index f8b98879f..f312cce45 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/ContentInspect/ContentInspect.swift +++ b/iOS/APIExample/APIExample/Examples/Advanced/ContentInspect/ContentInspect.swift @@ -6,7 +6,6 @@ // Copyright 漏 2022 Agora Corp. All rights reserved. // - /// Content Inspect /// This module show how to use sdk ability to inspect sexy video content. 
/// 1.Enable content inspect: agoraKit.enableContentInspect(true, config:inspectConfig). @@ -14,7 +13,6 @@ /// /// More detail: https://docs.agora.io/cn/content-moderation/landing-page?platform=iOS - import AgoraRtcKit class ContentInspectViewController: BaseViewController { @@ -23,7 +21,6 @@ class ContentInspectViewController: BaseViewController { var agoraKit: AgoraRtcEngineKit! - // MARK: - LifeCycle override func viewDidLoad() { super.viewDidLoad() @@ -50,7 +47,7 @@ class ContentInspectViewController: BaseViewController { let inspectConfig = AgoraContentInspectConfig() inspectConfig.modules = [moderateModule] - agoraKit.enableContentInspect(true, config:inspectConfig) + agoraKit.enableContentInspect(true, config: inspectConfig) let options = AgoraRtcChannelMediaOptions() options.publishCameraTrack = true @@ -61,8 +58,8 @@ class ContentInspectViewController: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "Join channel failed with errorCode: \(result)") } }) @@ -79,7 +76,10 @@ class ContentInspectViewController: BaseViewController { // MARK: - UI func setupUI () { - let rightBarButton = UIBarButtonItem(title: "SwitchCamera".localized, style: .plain, target: self, action: #selector(switchCameraBtnClicked)) + let rightBarButton = UIBarButtonItem(title: "SwitchCamera".localized, + style: .plain, + target: self, + action: #selector(switchCameraBtnClicked)) self.navigationItem.rightBarButtonItem = rightBarButton } @@ -93,8 +93,8 @@ extension ContentInspectViewController: AgoraRtcEngineDelegate { func rtcEngine(_ 
engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code LogUtils.log(message: "Error occur: \(errorCode)", level: .error) showAlert(title: "Error", message: "Error: \(errorCode.description)") } @@ -104,7 +104,6 @@ extension ContentInspectViewController: AgoraRtcEngineDelegate { } } - class ContentInspectEntryViewController: UIViewController { @IBOutlet weak var channelTextField: UITextField! diff --git a/iOS/APIExample/APIExample/Examples/Advanced/CreateDataStream/CreateDataStream.swift b/iOS/APIExample/APIExample/Examples/Advanced/CreateDataStream/CreateDataStream.swift index dc5bc7bca..ee6370382 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/CreateDataStream/CreateDataStream.swift +++ b/iOS/APIExample/APIExample/Examples/Advanced/CreateDataStream/CreateDataStream.swift @@ -21,12 +21,13 @@ class CreateDataStreamEntry: UIViewController { @IBAction func doJoinPressed(sender: UIButton) { guard let channelName = channelTextField.text else { return } - //resign channel text field + // resign channel text field channelTextField.resignFirstResponder() let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) // create new view controller every time to ensure we get a clean vc - guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else {return} + guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? 
BaseViewController else { return + } newViewController.title = channelName newViewController.configs = ["channelName": channelName] navigationController?.pushViewController(newViewController, animated: true) @@ -74,8 +75,8 @@ class CreateDataStreamMain: BaseViewController { guard let channelName = configs["channelName"] as? String, let resolution = GlobalSettings.shared.getSetting(key: "resolution")?.selectedOption().value as? CGSize, let fps = GlobalSettings.shared.getSetting(key: "fps")?.selectedOption().value as? AgoraVideoFrameRate, - let orientation = GlobalSettings.shared.getSetting(key: "orientation")?.selectedOption().value as? AgoraVideoOutputOrientationMode else {return} - + let orientation = GlobalSettings.shared.getSetting(key: "orientation")? + .selectedOption().value as? AgoraVideoOutputOrientationMode else { return } // make myself a broadcaster agoraKit.setClientRole(GlobalSettings.shared.getUserRole()) @@ -117,8 +118,8 @@ class CreateDataStreamMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -127,13 +128,9 @@ class CreateDataStreamMain: BaseViewController { /// send message @IBAction func onSendPress(_ sender: UIButton) { // indicate if stream has created - var streamCreated = false var streamId: Int = 0 - let message = messageField.text - if message == nil || message!.isEmpty { - return - } + guard let message = messageField.text, !message.isEmpty else { return } // create the data stream // Each user can create up to five 
data streams during the lifecycle of the agoraKit let config = AgoraDataStreamConfig() if result != 0 { showAlert(title: "Error", message: "createDataStream call failed: \(result), please check your params") } - let sendResult = agoraKit.sendStreamMessage(streamId, data: Data(message!.utf8)) + let sendResult = agoraKit.sendStreamMessage(streamId, + data: Data(message.utf8)) if sendResult != 0 { - showAlert(title: "Error", message: "sendStreamMessage call failed: \(sendResult), please check your params") + showAlert(title: "Error", + message: "sendStreamMessage call failed: \(sendResult), please check your params") } else { messageField.text = nil } @@ -180,8 +178,8 @@ extension CreateDataStreamMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) @@ -197,7 +195,8 @@ extension CreateDataStreamMain: AgoraRtcEngineDelegate { LogUtils.log(message: "Join \(channel) with uid \(uid) elapsed \(elapsed)ms", level: .info) } - /// callback when a remote user is joinning the channel, note audience in live broadcast mode will NOT trigger this event + /// callback when a remote user is joining the channel, + /// note audience in live broadcast mode will NOT trigger this event /// @param uid uid of
remote joined user /// @param elapsed time elapse since current sdk instance join the channel in ms func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) { @@ -214,7 +213,8 @@ extension CreateDataStreamMain: AgoraRtcEngineDelegate { agoraKit.setupRemoteVideo(videoCanvas) } - /// callback when a remote user is leaving the channel, note audience in live broadcast mode will NOT trigger this event + /// callback when a remote user is leaving the channel, + /// note audience in live broadcast mode will NOT trigger this event /// @param uid uid of remote joined user /// @param reason reason why this user left, note this event may be triggered when the remote user /// become an audience in live broadcasting profile @@ -238,12 +238,22 @@ extension CreateDataStreamMain: AgoraRtcEngineDelegate { showAlert(message: "from: \(uid) message: \(message)") } - func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurStreamMessageErrorFromUid uid: UInt, streamId: Int, error: Int, missed: Int, cached: Int) { - LogUtils.log(message: "didOccurStreamMessageErrorFromUid: \(uid), error \(error), missed \(missed), cached \(cached)", level: .info) + // swiftlint:disable function_parameter_count + func rtcEngine(_ engine: AgoraRtcEngineKit, + didOccurStreamMessageErrorFromUid uid: UInt, + streamId: Int, + error: Int, + missed: Int, + cached: Int) { + // swiftlint:enable function_parameter_count + let message = "streamMessageErrorFromUid: \(uid), error \(error)," + let message1 = " missed \(missed), cached \(cached)" + LogUtils.log(message: message + message1, level: .info) showAlert(message: "didOccurStreamMessageErrorFromUid: \(uid)") } - /// Reports the statistics of the current call. The SDK triggers this callback once every two seconds after the user joins the channel. + /// Reports the statistics of the current call. + /// The SDK triggers this callback once every two seconds after the user joins the channel. 
/// @param stats stats struct func rtcEngine(_ engine: AgoraRtcEngineKit, reportRtcStats stats: AgoraChannelStats) { localVideo.statsInfo?.updateChannelStats(stats) diff --git a/iOS/APIExample/APIExample/Examples/Advanced/CustomAudioRender/CustomAudioRender.swift b/iOS/APIExample/APIExample/Examples/Advanced/CustomAudioRender/CustomAudioRender.swift index ad26e8b66..3f7b4f5d8 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/CustomAudioRender/CustomAudioRender.swift +++ b/iOS/APIExample/APIExample/Examples/Advanced/CustomAudioRender/CustomAudioRender.swift @@ -10,8 +10,7 @@ import Foundation import AgoraRtcKit import AGEVideoLayout -class CustomAudioRenderEntry : UIViewController -{ +class CustomAudioRenderEntry: UIViewController { @IBOutlet weak var joinButton: AGButton! @IBOutlet weak var channelTextField: AGTextField! let identifier = "CustomAudioRender" @@ -22,14 +21,16 @@ class CustomAudioRenderEntry : UIViewController @IBAction func doJoinPressed(sender: AGButton) { guard let channelName = channelTextField.text else {return} - //resign channel text field + // resign channel text field channelTextField.resignFirstResponder() let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) // create new view controller every time to ensure we get a clean vc - guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else {return} + guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else { + return + } newViewController.title = channelName - newViewController.configs = ["channelName":channelName] + newViewController.configs = ["channelName": channelName] navigationController?.pushViewController(newViewController, animated: true) } } @@ -39,15 +40,15 @@ class CustomAudioRenderMain: BaseViewController { var exAudio: ExternalAudio = ExternalAudio.shared() @IBOutlet weak var container: AGEVideoContainer! 
- var audioViews: [UInt:VideoView] = [:] + var audioViews: [UInt: VideoView] = [:] // indicate if current instance has joined channel var isJoined: Bool = false - override func viewDidLoad(){ + override func viewDidLoad() { super.viewDidLoad() - let sampleRate:UInt = 44100, channel:UInt = 1 + let sampleRate: UInt = 44100, channel: UInt = 1 // set up agora instance when view loaded let config = AgoraRtcEngineConfig() @@ -70,15 +71,10 @@ class CustomAudioRenderMain: BaseViewController { // Set audio route to speaker agoraKit.setDefaultAudioRouteToSpeakerphone(true) - // important!! this example is using onPlaybackAudioFrame to do custom rendering - // by default the audio output will still be processed by SDK hence below api call is mandatory to disable that behavior -// agoraKit.adjustPlaybackSignalVolume(0) -// agoraKit.setAudioFrameDelegate(self) -// agoraKit.setPlaybackAudioFrameParametersWithSampleRate(Int(sampleRate), channel: Int(channel), mode: .readOnly, samplesPerCall: Int(sampleRate*channel)/100) - exAudio.setupExternalAudio(withAgoraKit: agoraKit, sampleRate: UInt32(sampleRate), channels: UInt32(channel), + trackId: 1, audioCRMode: .sdkCaptureExterRender, ioType: .remoteIO) agoraKit.setParameters("{\"che.audio.external_render\": true}") @@ -100,8 +96,8 @@ class CustomAudioRenderMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -135,8 +131,8 @@ extension CustomAudioRenderMain: AgoraRtcEngineDelegate { /// callback when error occured 
for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) @@ -147,7 +143,7 @@ extension CustomAudioRenderMain: AgoraRtcEngineDelegate { self.isJoined = true LogUtils.log(message: "Join \(channel) with uid \(uid) elapsed \(elapsed)ms", level: .info) - //set up local audio view, this view will not show video but just a placeholder + // set up local audio view, this view will not show video but just a placeholder let view = Bundle.loadView(fromNib: "VideoView", withType: VideoView.self) self.audioViews[uid] = view view.setPlaceholder(text: self.getAudioLabel(uid: uid, isLocal: true)) @@ -160,7 +156,7 @@ extension CustomAudioRenderMain: AgoraRtcEngineDelegate { func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) { LogUtils.log(message: "remote user join: \(uid) \(elapsed)ms", level: .info) - //set up remote audio view, this view will not show video but just a placeholder + // set up remote audio view, this view will not show video but just a placeholder let view = Bundle.loadView(fromNib: "VideoView", withType: VideoView.self) self.audioViews[uid] = view view.setPlaceholder(text: self.getAudioLabel(uid: uid, isLocal: false)) @@ -175,7 +171,7 @@ extension CustomAudioRenderMain: AgoraRtcEngineDelegate { func rtcEngine(_ engine: AgoraRtcEngineKit, didOfflineOfUid uid: UInt, reason: 
AgoraUserOfflineReason) { LogUtils.log(message: "remote user left: \(uid) reason \(reason)", level: .info) - //remove remote audio view + // remove remote audio view self.audioViews.removeValue(forKey: uid) self.container.layoutStream3x3(views: Array(self.audioViews.values)) self.container.reload(level: 0, animated: true) diff --git a/iOS/APIExample/APIExample/Examples/Advanced/CustomAudioSource/CustomAudioSource.swift b/iOS/APIExample/APIExample/Examples/Advanced/CustomAudioSource/CustomAudioSource.swift index 877ab62ac..d9cf8791b 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/CustomAudioSource/CustomAudioSource.swift +++ b/iOS/APIExample/APIExample/Examples/Advanced/CustomAudioSource/CustomAudioSource.swift @@ -11,8 +11,7 @@ import AgoraRtcKit import AGEVideoLayout import AVFoundation -class CustomAudioSourceEntry : UIViewController -{ +class CustomAudioSourceEntry: UIViewController { @IBOutlet weak var joinButton: AGButton! @IBOutlet weak var channelTextField: AGTextField! let identifier = "CustomAudioSource" @@ -23,12 +22,14 @@ class CustomAudioSourceEntry : UIViewController @IBAction func doJoinPressed(sender: AGButton) { guard let channelName = channelTextField.text else {return} - //resign channel text field + // resign channel text field channelTextField.resignFirstResponder() let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) // create new view controller every time to ensure we get a clean vc - guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else {return} + guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? 
BaseViewController else { + return + } newViewController.title = channelName newViewController.configs = ["channelName": channelName] navigationController?.pushViewController(newViewController, animated: true) @@ -39,15 +40,16 @@ class CustomAudioSourceMain: BaseViewController { var agoraKit: AgoraRtcEngineKit! var exAudio: ExternalAudio = ExternalAudio.shared() @IBOutlet weak var container: AGEVideoContainer! - var audioViews: [UInt:VideoView] = [:] + var audioViews: [UInt: VideoView] = [:] // indicate if current instance has joined channel var isJoined: Bool = false + private var trackId: Int32 = 0 - override func viewDidLoad(){ + override func viewDidLoad() { super.viewDidLoad() - let sampleRate:UInt = 44100, channel:UInt = 1 + let sampleRate: UInt = 44100, channel: UInt = 1 // set up agora instance when view loaded let config = AgoraRtcEngineConfig() @@ -72,10 +74,17 @@ class CustomAudioSourceMain: BaseViewController { // Set audio route to speaker agoraKit.setDefaultAudioRouteToSpeakerphone(true) + let audioTrack = AgoraAudioTrackConfig() + audioTrack.enableLocalPlayback = true + trackId = agoraKit.createCustomAudioTrack(.mixable, config: audioTrack) + // setup external audio source - exAudio.setupExternalAudio(withAgoraKit: agoraKit, sampleRate: UInt32(sampleRate), channels: UInt32(channel), audioCRMode: .exterCaptureExterRender, ioType: .remoteIO) - // MIGRATED - agoraKit.setExternalAudioSource(true, sampleRate: Int(sampleRate), channels: Int(channel)) + exAudio.setupExternalAudio(withAgoraKit: agoraKit, + sampleRate: UInt32(sampleRate), + channels: UInt32(channel), + trackId: trackId, + audioCRMode: .exterCaptureExterRender, + ioType: .remoteIO) // start joining channel // 1. 
Users can only see each other after they join the @@ -86,14 +95,15 @@ class CustomAudioSourceMain: BaseViewController { let option = AgoraRtcChannelMediaOptions() option.publishCameraTrack = false option.publishCustomAudioTrack = true + option.publishCustomAudioTrackId = Int(trackId) option.clientRoleType = GlobalSettings.shared.getUserRole() NetworkManager.shared.generateToken(channelName: channelName, success: { token in let result = self.agoraKit.joinChannel(byToken: token, channelId: channelName, uid: 0, mediaOptions: option) if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -104,6 +114,7 @@ class CustomAudioSourceMain: BaseViewController { // leave channel when exiting the view if isJoined { agoraKit.disableAudio() + agoraKit.destroyCustomAudioTrack(Int(trackId)) exAudio.stopWork() agoraKit.leaveChannel { (stats) -> Void in LogUtils.log(message: "left channel, duration: \(stats.duration)", level: .info) @@ -128,21 +139,24 @@ extension CustomAudioSourceMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + 
/// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) self.showAlert(title: "Error", message: "Error \(errorCode.description) occur") } - func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinChannel channel: String, withUid uid: UInt, elapsed: Int) { + func rtcEngine(_ engine: AgoraRtcEngineKit, + didJoinChannel channel: String, + withUid uid: UInt, + elapsed: Int) { self.isJoined = true LogUtils.log(message: "Join \(channel) with uid \(uid) elapsed \(elapsed)ms", level: .info) - let sampleRate:UInt = 44100 + let sampleRate: UInt = 44100 try? AVAudioSession.sharedInstance().setPreferredSampleRate(Double(sampleRate)) self.exAudio.startWork() - //set up local audio view, this view will not show video but just a placeholder + // set up local audio view, this view will not show video but just a placeholder let view = Bundle.loadView(fromNib: "VideoView", withType: VideoView.self) self.audioViews[uid] = view view.setPlaceholder(text: self.getAudioLabel(uid: uid, isLocal: true)) @@ -158,7 +172,7 @@ extension CustomAudioSourceMain: AgoraRtcEngineDelegate { LogUtils.log(message: "Ignore pcm play uid", level: .info) return } - //set up remote audio view, this view will not show video but just a placeholder + // set up remote audio view, this view will not show video but just a placeholder let view = Bundle.loadView(fromNib: "VideoView", withType: VideoView.self) self.audioViews[uid] = view view.setPlaceholder(text: self.getAudioLabel(uid: uid, isLocal: false)) @@ -173,7 +187,7 @@ extension CustomAudioSourceMain: AgoraRtcEngineDelegate { func rtcEngine(_ engine: AgoraRtcEngineKit, didOfflineOfUid uid: UInt, reason: AgoraUserOfflineReason) { LogUtils.log(message: "remote user left: \(uid) reason \(reason)", level: .info) - //remove remote audio view + // remove remote audio view 
self.audioViews.removeValue(forKey: uid) self.container.layoutStream3x3(views: Array(self.audioViews.values)) self.container.reload(level: 0, animated: true) diff --git a/iOS/APIExample/APIExample/Examples/Advanced/CustomPcmAudioSource/CustomPcmAudioSource.swift b/iOS/APIExample/APIExample/Examples/Advanced/CustomPcmAudioSource/CustomPcmAudioSource.swift index c7a67743e..55582ffb8 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/CustomPcmAudioSource/CustomPcmAudioSource.swift +++ b/iOS/APIExample/APIExample/Examples/Advanced/CustomPcmAudioSource/CustomPcmAudioSource.swift @@ -11,8 +11,7 @@ import AgoraRtcKit import AGEVideoLayout import AVFoundation -class CustomPcmAudioSourceEntry : UIViewController -{ +class CustomPcmAudioSourceEntry: UIViewController { @IBOutlet weak var joinButton: AGButton! @IBOutlet weak var channelTextField: AGTextField! let identifier = "CustomPcmAudioSource" @@ -23,7 +22,7 @@ class CustomPcmAudioSourceEntry : UIViewController @IBAction func doJoinPressed(sender: AGButton) { guard let channelName = channelTextField.text else { return } - //resign channel text field + // resign channel text field channelTextField.resignFirstResponder() let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) @@ -39,7 +38,7 @@ class CustomPcmAudioSourceMain: BaseViewController { var agoraKit: AgoraRtcEngineKit! var pcmSourcePush: AgoraPcmSourcePush? @IBOutlet weak var container: AGEVideoContainer! - var audioViews: [UInt:VideoView] = [:] + var audioViews: [UInt: VideoView] = [:] @IBOutlet weak var playAudioView: UIView! @IBOutlet weak var pushPcmSwitch: UISwitch! 
private var trackId: Int32 = 0 @@ -51,8 +50,8 @@ class CustomPcmAudioSourceMain: BaseViewController { } } - let sampleRate:UInt = 44100, channel:UInt = 1, bitPerSample = 16, samples = 441 * 10 - override func viewDidLoad(){ + let sampleRate: UInt = 44100, channel: UInt = 1, bitPerSample = 16, samples = 441 * 10 + override func viewDidLoad() { super.viewDidLoad() // set up agora instance when view loaded @@ -105,8 +104,8 @@ class CustomPcmAudioSourceMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -166,8 +165,8 @@ extension CustomPcmAudioSourceMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) @@ -177,7 +176,7 @@ extension CustomPcmAudioSourceMain: AgoraRtcEngineDelegate { func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinChannel 
channel: String, withUid uid: UInt, elapsed: Int) { self.isJoined = true LogUtils.log(message: "Join \(channel) with uid \(uid) elapsed \(elapsed)ms", level: .info) - //set up local audio view, this view will not show video but just a placeholder + // set up local audio view, this view will not show video but just a placeholder let view = Bundle.loadView(fromNib: "VideoView", withType: VideoView.self) self.audioViews[uid] = view view.setPlaceholder(text: self.getAudioLabel(uid: uid, isLocal: true)) @@ -189,7 +188,7 @@ extension CustomPcmAudioSourceMain: AgoraRtcEngineDelegate { /// @param elapsed time elapse since current sdk instance join the channel in ms func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) { LogUtils.log(message: "remote user join: \(uid) \(elapsed)ms", level: .info) - //set up remote audio view, this view will not show video but just a placeholder + // set up remote audio view, this view will not show video but just a placeholder let view = Bundle.loadView(fromNib: "VideoView", withType: VideoView.self) self.audioViews[uid] = view view.setPlaceholder(text: self.getAudioLabel(uid: uid, isLocal: false)) @@ -204,10 +203,9 @@ extension CustomPcmAudioSourceMain: AgoraRtcEngineDelegate { func rtcEngine(_ engine: AgoraRtcEngineKit, didOfflineOfUid uid: UInt, reason: AgoraUserOfflineReason) { LogUtils.log(message: "remote user left: \(uid) reason \(reason)", level: .info) - //remove remote audio view + // remove remote audio view self.audioViews.removeValue(forKey: uid) self.container.layoutStream3x3(views: Array(self.audioViews.values)) self.container.reload(level: 0, animated: true) } } - diff --git a/iOS/APIExample/APIExample/Examples/Advanced/CustomVideoRender/CustomVideoRender.swift b/iOS/APIExample/APIExample/Examples/Advanced/CustomVideoRender/CustomVideoRender.swift index e68f475a4..d497a6c46 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/CustomVideoRender/CustomVideoRender.swift +++ 
b/iOS/APIExample/APIExample/Examples/Advanced/CustomVideoRender/CustomVideoRender.swift @@ -9,8 +9,7 @@ import UIKit import AGEVideoLayout import AgoraRtcKit -class CustomVideoRenderEntry : UIViewController -{ +class CustomVideoRenderEntry: UIViewController { @IBOutlet weak var joinButton: AGButton! @IBOutlet weak var channelTextField: AGTextField! let identifier = "CustomVideoRender" @@ -20,15 +19,15 @@ class CustomVideoRenderEntry : UIViewController } @IBAction func doJoinPressed(sender: AGButton) { - guard let channelName = channelTextField.text else {return} - //resign channel text field + guard let channelName = channelTextField.text else { return } + // resign channel text field channelTextField.resignFirstResponder() let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) // create new view controller every time to ensure we get a clean vc guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else {return} newViewController.title = channelName - newViewController.configs = ["channelName":channelName] + newViewController.configs = ["channelName": channelName] navigationController?.pushViewController(newViewController, animated: true) } } @@ -74,16 +73,14 @@ class CustomVideoRenderMain: BaseViewController { agoraKit.enableAudio() let resolution = (GlobalSettings.shared.getSetting(key: "resolution")?.selectedOption().value as? CGSize) ?? .zero let fps = (GlobalSettings.shared.getSetting(key: "fps")?.selectedOption().value as? AgoraVideoFrameRate) ?? .fps15 - let orientation = (GlobalSettings.shared.getSetting(key: "orientation")?.selectedOption().value as? AgoraVideoOutputOrientationMode) ?? .fixedPortrait + let orientation = (GlobalSettings.shared.getSetting(key: "orientation")? + .selectedOption().value as? AgoraVideoOutputOrientationMode) ?? 
.fixedPortrait agoraKit.setVideoEncoderConfiguration(AgoraVideoEncoderConfiguration(size: resolution, frameRate: fps, bitrate: AgoraVideoBitrateStandard, orientationMode: orientation, mirrorMode: .auto)) - - - // Set audio route to speaker agoraKit.setDefaultAudioRouteToSpeakerphone(true) @@ -110,21 +107,16 @@ class CustomVideoRenderMain: BaseViewController { NetworkManager.shared.generateToken(channelName: channelName, success: { token in let result = self.agoraKit.joinChannel(byToken: token, channelId: channelName, uid: 0, mediaOptions: option) - // let result = agoraKit.joinChannel(byToken: nil, channelId: channelName, info: nil, uid: 0) {[unowned self] (channel, uid, elapsed) -> Void in - // self.isJoined = true - // LogUtils.log(message: "Join \(channel) with uid \(uid) elapsed \(elapsed)ms", level: .info) - // } if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) } - override func willMove(toParent parent: UIViewController?) 
{ if parent == nil { // leave channel when exiting the view @@ -154,8 +146,8 @@ extension CustomVideoRenderMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) diff --git a/iOS/APIExample/APIExample/Examples/Advanced/CustomVideoSourcePush/CustomVideoSourcePush.swift b/iOS/APIExample/APIExample/Examples/Advanced/CustomVideoSourcePush/CustomVideoSourcePush.swift index 05f3f23eb..2679f0521 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/CustomVideoSourcePush/CustomVideoSourcePush.swift +++ b/iOS/APIExample/APIExample/Examples/Advanced/CustomVideoSourcePush/CustomVideoSourcePush.swift @@ -10,7 +10,7 @@ import AGEVideoLayout import AgoraRtcKit import AVFoundation -class CustomVideoSourcePreview : UIView { +class CustomVideoSourcePreview: UIView { private var previewLayer: AVCaptureVideoPreviewLayer? func insertCaptureVideoPreviewLayer(previewLayer: AVCaptureVideoPreviewLayer) { @@ -27,8 +27,7 @@ class CustomVideoSourcePreview : UIView { } } -class CustomVideoSourcePushEntry : UIViewController -{ +class CustomVideoSourcePushEntry: UIViewController { @IBOutlet weak var joinButton: AGButton! @IBOutlet weak var channelTextField: AGTextField! 
let identifier = "CustomVideoSourcePush" @@ -38,15 +37,16 @@ class CustomVideoSourcePushEntry : UIViewController } @IBAction func doJoinPressed(sender: AGButton) { - guard let channelName = channelTextField.text else {return} - //resign channel text field + guard let channelName = channelTextField.text else { return } + // resign channel text field channelTextField.resignFirstResponder() let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) // create new view controller every time to ensure we get a clean vc - guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else {return} + guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else { return + } newViewController.title = channelName - newViewController.configs = ["channelName":channelName] + newViewController.configs = ["channelName": channelName] navigationController?.pushViewController(newViewController, animated: true) } } @@ -54,7 +54,7 @@ class CustomVideoSourcePushEntry : UIViewController class CustomVideoSourcePushMain: BaseViewController { var localVideo = Bundle.loadView(fromNib: "VideoViewSampleBufferDisplayView", withType: SampleBufferDisplayView.self) var remoteVideo = Bundle.loadView(fromNib: "VideoView", withType: VideoView.self) - var customCamera:AgoraYUVImageSourcePush? + var customCamera: AgoraYUVImageSourcePush? @IBOutlet weak var container: AGEVideoContainer! var agoraKit: AgoraRtcEngineKit! @@ -82,7 +82,7 @@ class CustomVideoSourcePushMain: BaseViewController { guard let channelName = configs["channelName"] as? 
String else {return} // make myself a broadcaster - //agoraKit.setClientRole(.broadcaster) + // agoraKit.setClientRole(.broadcaster) // enable video module and set up video encoding configs agoraKit.enableVideo() @@ -92,15 +92,17 @@ class CustomVideoSourcePushMain: BaseViewController { // note setupLocalVideo is not working when using pushExternalVideoFrame // so you will have to prepare the preview yourself customCamera = AgoraYUVImageSourcePush(size: CGSize(width: 320, height: 180), - fileName: "sample" , + fileName: "sample", frameRate: 15) customCamera?.delegate = self customCamera?.startSource() + customCamera?.trackId = 0 agoraKit.setExternalVideoSource(true, useTexture: true, sourceType: .videoFrame) let resolution = (GlobalSettings.shared.getSetting(key: "resolution")?.selectedOption().value as? CGSize) ?? .zero let fps = (GlobalSettings.shared.getSetting(key: "fps")?.selectedOption().value as? AgoraVideoFrameRate) ?? .fps15 - let orientation = (GlobalSettings.shared.getSetting(key: "orientation")?.selectedOption().value as? AgoraVideoOutputOrientationMode) ?? .fixedPortrait + let orientation = (GlobalSettings.shared.getSetting(key: "orientation")? + .selectedOption().value as? AgoraVideoOutputOrientationMode) ?? 
.fixedPortrait agoraKit.setVideoEncoderConfiguration(AgoraVideoEncoderConfiguration(size: resolution, frameRate: fps, bitrate: AgoraVideoBitrateStandard, @@ -125,8 +127,8 @@ class CustomVideoSourcePushMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -168,8 +170,8 @@ extension CustomVideoSourcePushMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) @@ -234,8 +236,8 @@ extension CustomVideoSourcePushMain: AgoraYUVImageSourcePushDelegate { videoFrame.format = 12 videoFrame.textureBuf = buffer videoFrame.rotation = Int32(rotation) - //once we have the video frame, we can push to agora sdk - agoraKit?.pushExternalVideoFrame(videoFrame) + // once we have the video frame, we can push to agora sdk + 
agoraKit.pushExternalVideoFrame(videoFrame, videoTrackId: trackId) let outputVideoFrame = AgoraOutputVideoFrame() outputVideoFrame.width = Int32(size.width) diff --git a/iOS/APIExample/APIExample/Examples/Advanced/CustomVideoSourcePushMulti/CustomVideoSourcePushMulti.swift b/iOS/APIExample/APIExample/Examples/Advanced/CustomVideoSourcePushMulti/CustomVideoSourcePushMulti.swift index 76608f83d..6d40f4563 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/CustomVideoSourcePushMulti/CustomVideoSourcePushMulti.swift +++ b/iOS/APIExample/APIExample/Examples/Advanced/CustomVideoSourcePushMulti/CustomVideoSourcePushMulti.swift @@ -20,8 +20,7 @@ class UserModel { var customEncodeSource: KFMP4Demuxer? } -class CustomVideoSourcePushMultiEntry : UIViewController -{ +class CustomVideoSourcePushMultiEntry: UIViewController { @IBOutlet weak var joinButton: AGButton! @IBOutlet weak var channelTextField: AGTextField! let identifier = "CustomVideoSourcePushMulti" @@ -31,15 +30,17 @@ class CustomVideoSourcePushMultiEntry : UIViewController } @IBAction func doJoinPressed(sender: AGButton) { - guard let channelName = channelTextField.text else {return} - //resign channel text field + guard let channelName = channelTextField.text else { return } + // resign channel text field channelTextField.resignFirstResponder() let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) // create new view controller every time to ensure we get a clean vc - guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else {return} + guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? 
BaseViewController else { + return + } newViewController.title = channelName - newViewController.configs = ["channelName":channelName] + newViewController.configs = ["channelName": channelName] self.navigationController?.pushViewController(newViewController, animated: true) } } @@ -54,7 +55,7 @@ class CustomVideoSourcePushMultiMain: BaseViewController { withType: SampleBufferDisplayView.self) return model }) - var customCamera:AgoraYUVImageSourcePush? + var customCamera: AgoraYUVImageSourcePush? @IBOutlet weak var container: AGEVideoContainer! var agoraKit: AgoraRtcEngineKit! @@ -78,7 +79,7 @@ class CustomVideoSourcePushMultiMain: BaseViewController { container.layoutStream2x2(views: [localVideo] + remoteVideos.compactMap({ $0.canvasView })) // make myself a broadcaster - //agoraKit.setClientRole(.broadcaster) + // agoraKit.setClientRole(.broadcaster) // enable video module and set up video encoding configs agoraKit.enableVideo() @@ -88,7 +89,7 @@ class CustomVideoSourcePushMultiMain: BaseViewController { // note setupLocalVideo is not working when using pushExternalVideoFrame // so you will have to prepare the preview yourself customCamera = AgoraYUVImageSourcePush(size: CGSize(width: 320, height: 180), - fileName: "sample" , + fileName: "sample", frameRate: 15) customCamera?.trackId = agoraKit.createCustomVideoTrack() customCamera?.delegate = self @@ -97,7 +98,8 @@ class CustomVideoSourcePushMultiMain: BaseViewController { let resolution = (GlobalSettings.shared.getSetting(key: "resolution")?.selectedOption().value as? CGSize) ?? .zero let fps = (GlobalSettings.shared.getSetting(key: "fps")?.selectedOption().value as? AgoraVideoFrameRate) ?? .fps15 - let orientation = (GlobalSettings.shared.getSetting(key: "orientation")?.selectedOption().value as? AgoraVideoOutputOrientationMode) ?? .fixedPortrait + let orientation = (GlobalSettings.shared.getSetting(key: "orientation")? + .selectedOption().value as? AgoraVideoOutputOrientationMode) ?? 
.fixedPortrait agoraKit.setVideoEncoderConfiguration(AgoraVideoEncoderConfiguration(size: resolution, frameRate: fps, bitrate: AgoraVideoBitrateStandard, @@ -139,8 +141,8 @@ class CustomVideoSourcePushMultiMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } } @@ -156,7 +158,8 @@ class CustomVideoSourcePushMultiMain: BaseViewController { userModel.isEncode = true userModel.isJoin = true } - NetworkManager.shared.download(urlString: "https://agora-adc-artifacts.s3.cn-north-1.amazonaws.com.cn/resources/sample.mp4") { response in + let urlString = "https://agora-adc-artifacts.s3.cn-north-1.amazonaws.com.cn/resources/sample.mp4" + NetworkManager.shared.download(urlString: urlString) { response in let path = response["path"] as? String let config = KFDemuxerConfig() config.demuxerType = .video @@ -171,9 +174,9 @@ class CustomVideoSourcePushMultiMain: BaseViewController { info.frameType = .keyFrame info.framesPerSecond = 30 info.codecType = .H264 - self.agoraKit.pushExternalEncodedVideoFrameEx(data, - info: info, - videoTrackId: UInt(userModel.trackId)) + self.agoraKit.pushExternalEncodedVideoFrame(data, + info: info, + videoTrackId: UInt(userModel.trackId)) userModel.canvasView?.videoView.renderVideoSampleBuffer(sampleBuffer, size: demuxer?.videoSize ?? 
.zero) } } failure: { error in @@ -192,7 +195,7 @@ class CustomVideoSourcePushMultiMain: BaseViewController { private func createVideoTrack(userModel: UserModel) { let customCamera = AgoraYUVImageSourcePush(size: CGSize(width: 320, height: 180), - fileName: "sample" , + fileName: "sample", frameRate: 15) customCamera.trackId = userModel.trackId customCamera.delegate = self @@ -268,8 +271,8 @@ extension CustomVideoSourcePushMultiMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) @@ -291,10 +294,8 @@ extension CustomVideoSourcePushMultiMain: AgoraRtcEngineDelegate { // tutorial. Here we check if there exists a surface // view tagged as this uid. 
if uid == 999 { return }
-        for model in remoteVideos {
-            if model.uid == uid {
-                return
-            }
+        for model in remoteVideos where model.uid == uid {
+            return
         }
         let videoCanvas = AgoraRtcVideoCanvas()
         videoCanvas.uid = uid
@@ -353,7 +354,7 @@ extension CustomVideoSourcePushMultiMain: AgoraYUVImageSourcePushDelegate {
             let userModel = remoteVideos.first(where: { $0.trackId == trackId })
             userModel?.canvasView?.videoView.renderVideoPixelBuffer(outputVideoFrame)
         }
-        //once we have the video frame, we can push to agora sdk
+        // once we have the video frame, we can push to agora sdk
         agoraKit?.pushExternalVideoFrame(videoFrame, videoTrackId: trackId)
     }
 }
diff --git a/iOS/APIExample/APIExample/Examples/Advanced/FaceCapture/Base.lproj/FaceCapture.storyboard b/iOS/APIExample/APIExample/Examples/Advanced/FaceCapture/Base.lproj/FaceCapture.storyboard
new file mode 100644
index 000000000..e46a4510d
--- /dev/null
+++ b/iOS/APIExample/APIExample/Examples/Advanced/FaceCapture/Base.lproj/FaceCapture.storyboard
@@ -0,0 +1,100 @@
diff --git a/iOS/APIExample/APIExample/Examples/Advanced/FaceCapture/FaceCapture.swift b/iOS/APIExample/APIExample/Examples/Advanced/FaceCapture/FaceCapture.swift
new file mode 100644
index 000000000..cbe3cdaa3
--- /dev/null
+++ b/iOS/APIExample/APIExample/Examples/Advanced/FaceCapture/FaceCapture.swift
@@ -0,0 +1,290 @@
+//
+//  FaceCapture.swift
+//  APIExample
+//
+//  Created by 寮犱咕娉 on 2020/4/17.
+//  Copyright © 2020 Agora Corp. All rights reserved.
+//
+import UIKit
+import AGEVideoLayout
+import AgoraRtcKit
+
+class FaceCaptureEntry: UIViewController {
+    @IBOutlet weak var joinButton: UIButton!
+    @IBOutlet weak var channelTextField: UITextField!
+    let identifier = "FaceCapture"
+    @IBOutlet var resolutionBtn: UIButton!
+    @IBOutlet var fpsBtn: UIButton!
+ @IBOutlet var orientationBtn: UIButton! + var width: Int = 960, height: Int = 540, orientation: AgoraVideoOutputOrientationMode = .adaptative, fps = 15 + + override func viewDidLoad() { + super.viewDidLoad() + } + + func getResolutionAction(width: Int, height: Int) -> UIAlertAction { + return UIAlertAction(title: "\(width)x\(height)", style: .default, handler: {[unowned self] _ in + self.width = width + self.height = height + self.resolutionBtn.setTitle("\(width)x\(height)", for: .normal) + }) + } + + func getFpsAction(_ fps: Int) -> UIAlertAction { + return UIAlertAction(title: "\(fps)fps", style: .default, handler: { [unowned self] _ in + self.fps = fps + self.fpsBtn.setTitle("\(fps)fps", for: .normal) + }) + } + + func getOrientationAction(_ orientation: AgoraVideoOutputOrientationMode) -> UIAlertAction { + return UIAlertAction(title: "\(orientation.description())", style: .default, handler: { [unowned self] _ in + self.orientation = orientation + self.orientationBtn.setTitle("\(orientation.description())", for: .normal) + }) + } + + @IBAction func setResolution() { + let style: UIAlertController.Style = UIDevice.current.userInterfaceIdiom == .pad ? .alert : .actionSheet + let alert = UIAlertController(title: "Set Resolution".localized, + message: nil, + preferredStyle: style) + alert.addAction(getResolutionAction(width: 90, height: 90)) + alert.addAction(getResolutionAction(width: 160, height: 120)) + alert.addAction(getResolutionAction(width: 320, height: 240)) + alert.addAction(getResolutionAction(width: 960, height: 540)) + alert.addAction(getResolutionAction(width: 1280, height: 720)) + alert.addCancelAction() + present(alert, animated: true, completion: nil) + } + + @IBAction func setFps() { + let style: UIAlertController.Style = UIDevice.current.userInterfaceIdiom == .pad ? 
.alert : .actionSheet + let alert = UIAlertController(title: "Set Fps".localized, + message: nil, + preferredStyle: style) + alert.addAction(getFpsAction(10)) + alert.addAction(getFpsAction(15)) + alert.addAction(getFpsAction(24)) + alert.addAction(getFpsAction(30)) + alert.addAction(getFpsAction(60)) + alert.addCancelAction() + present(alert, animated: true, completion: nil) + } + + @IBAction func setOrientation() { + let style: UIAlertController.Style = UIDevice.current.userInterfaceIdiom == .pad ? .alert : .actionSheet + let alert = UIAlertController(title: "Set Orientation".localized, + message: nil, + preferredStyle: style) + alert.addAction(getOrientationAction(.adaptative)) + alert.addAction(getOrientationAction(.fixedLandscape)) + alert.addAction(getOrientationAction(.fixedPortrait)) + alert.addCancelAction() + present(alert, animated: true, completion: nil) + } + + @IBAction func doJoinPressed(sender: UIButton) { + guard let channelName = channelTextField.text else { return } + // resign channel text field + channelTextField.resignFirstResponder() + + let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) + // create new view controller every time to ensure we get a clean vc + guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else {return} + newViewController.title = channelName + newViewController.configs = ["channelName": channelName, + "resolution": CGSize(width: width, height: height), + "fps": fps, + "orientation": orientation] + navigationController?.pushViewController(newViewController, animated: true) + } +} + +class FaceCaptureMain: BaseViewController { + var localVideo = Bundle.loadVideoView(type: .local, audioOnly: false) + + @IBOutlet weak var container: AGEVideoContainer! + var agoraKit: AgoraRtcEngineKit! 
+ + // indicate if current instance has joined channel + var isJoined: Bool = false + + override func viewDidLoad() { + super.viewDidLoad() + // layout render view + localVideo.setPlaceholder(text: "Local Host".localized) + container.layoutStream(views: [localVideo]) + + // set up agora instance when view loaded + let config = AgoraRtcEngineConfig() + config.appId = KeyCenter.AppId + config.areaCode = GlobalSettings.shared.area + config.channelProfile = .liveBroadcasting + agoraKit = AgoraRtcEngineKit.sharedEngine(with: config, delegate: self) + // Configuring Privatization Parameters + Util.configPrivatization(agoraKit: agoraKit) + + agoraKit.setLogFile(LogUtils.sdkLogPath()) + + // get channel name from configs + guard let channelName = configs["channelName"] as? String, + let resolution = configs["resolution"] as? CGSize, + let fps = configs["fps"] as? Int, + let orientation = configs["orientation"] as? AgoraVideoOutputOrientationMode else {return} + + // make myself a broadcaster + agoraKit.setClientRole(GlobalSettings.shared.getUserRole()) + // enable video module and set up video encoding configs + agoraKit.enableVideo() + agoraKit.enableAudio() + agoraKit.setVideoEncoderConfiguration(AgoraVideoEncoderConfiguration(size: resolution, + frameRate: AgoraVideoFrameRate(rawValue: fps) ?? .fps15, + bitrate: AgoraVideoBitrateStandard, + orientationMode: orientation, mirrorMode: .auto)) + + if (KeyCenter.FaceCaptureLicense ?? 
"").isEmpty { + showAlert(message: "Please contact Agora customer service to obtain a face capture certificate".localized) + } else { + // enable face capture + agoraKit.enableExtension(withVendor: "agora_video_filters_face_capture", + extension: "face_capture", + enabled: true, + sourceType: .primaryCamera) + + agoraKit.setExtensionPropertyWithVendor("agora_video_filters_face_capture", + extension: "face_capture", + key: "authentication_information", + value: "{\"company_id\":\"agoraTest\"," + + "\"license\":\"" + (KeyCenter.FaceCaptureLicense ?? "") + "\"}", + sourceType: .primaryCamera) + agoraKit.setVideoFrameDelegate(self) + } + + // set up local video to render your local camera preview + let videoCanvas = AgoraRtcVideoCanvas() + videoCanvas.uid = 0 + // the view to be binded + videoCanvas.view = localVideo.videoView + videoCanvas.renderMode = .hidden + agoraKit.setupLocalVideo(videoCanvas) + // you have to call startPreview to see local video + agoraKit.startPreview() + + // Set audio route to speaker + agoraKit.setDefaultAudioRouteToSpeakerphone(true) + + // start joining channel + // 1. Users can only see each other after they join the + // same channel successfully using the same app id. + // 2. If app certificate is turned on at dashboard, token is needed + // when joining channel. 
The channel name and uid used to calculate + // the token has to match the ones used for channel join + let option = AgoraRtcChannelMediaOptions() + option.publishCameraTrack = GlobalSettings.shared.getUserRole() == .broadcaster + option.publishMicrophoneTrack = GlobalSettings.shared.getUserRole() == .broadcaster + option.clientRoleType = GlobalSettings.shared.getUserRole() + NetworkManager.shared.generateToken(channelName: channelName, success: { token in + let result = self.agoraKit.joinChannel(byToken: token, channelId: channelName, uid: 0, mediaOptions: option) + if result != 0 { + // Usually happens with invalid parameters + // Error code description can be found at: + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code + self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") + } + }) + } + + override func viewDidDisappear(_ animated: Bool) { + super.viewDidDisappear(animated) + agoraKit.disableAudio() + agoraKit.disableVideo() + if isJoined { + agoraKit.stopPreview() + agoraKit.leaveChannel { (stats) -> Void in + LogUtils.log(message: "left channel, duration: \(stats.duration)", level: .info) + } + } + } +} + +extension FaceCaptureMain: AgoraVideoFrameDelegate { + func onCapture(_ videoFrame: AgoraOutputVideoFrame, sourceType: AgoraVideoSourceType) -> Bool { + let info = videoFrame.metaInfo["KEY_FACE_CAPTURE"] as? 
String
+        localVideo.statsInfo?.updateMetaInfo(data: info)
+        return true
+    }
+    func getVideoFrameProcessMode() -> AgoraVideoFrameProcessMode {
+        .readWrite
+    }
+    func getObservedFramePosition() -> AgoraVideoFramePosition {
+        .postCapture
+    }
+}
+
+/// agora rtc engine delegate events
+extension FaceCaptureMain: AgoraRtcEngineDelegate {
+    /// callback when warning occurred for agora sdk, warning can usually be ignored, still it's nice to check out
+    /// what is happening
+    /// Warning code description can be found at:
+    /// en: https://api-ref.agora.io/en/voice-sdk/ios/3.x/Constants/AgoraWarningCode.html
+    /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraWarningCode.html
+    /// @param warningCode warning code of the problem
+    func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurWarning warningCode: AgoraWarningCode) {
+        LogUtils.log(message: "warning: \(warningCode.description)", level: .warning)
+    }
+
+    /// callback when error occurred for agora sdk, you are recommended to display the error descriptions on demand
+    /// to let user know something wrong is happening
+    /// Error code description can be found at:
+    /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode
+    /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code
+    /// @param errorCode error code of the problem
+    func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) {
+        LogUtils.log(message: "error: \(errorCode)", level: .error)
+        self.showAlert(title: "Error", message: "Error \(errorCode.description) occur")
+    }
+
+    func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinChannel channel: String, withUid uid: UInt, elapsed: Int) {
+        self.isJoined = true
+        LogUtils.log(message: "Join \(channel) with uid \(uid) elapsed \(elapsed)ms", level: .info)
+    }
+
+    /// callback when a remote user is joining the channel, note audience in live broadcast mode will NOT trigger this event
+    /// @param uid uid of remote joined user
+    /// @param elapsed time elapsed since current sdk instance join the channel in ms
+    func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinedOfUid uid: UInt, elapsed: Int) {
+        LogUtils.log(message: "remote user join: \(uid) \(elapsed)ms", level: .info)
+    }
+
+    /// callback when a remote user is leaving the channel, note audience in live broadcast mode will NOT trigger this event
+    /// @param uid uid of remote joined user
+    /// @param reason reason why this user left, note this event may be triggered when the remote user
+    /// become an audience in live broadcasting profile
+    func rtcEngine(_ engine: AgoraRtcEngineKit, didOfflineOfUid uid: UInt, reason: AgoraUserOfflineReason) {
+        LogUtils.log(message: "remote user left: \(uid) reason \(reason)", level: .info)
+
+        // to unlink your view from sdk, so that your view reference will be released
+        // note the video will stay at its last frame, to completely remove it
+        // you will need to remove the EAGL sublayer from your binded view
+        let videoCanvas = AgoraRtcVideoCanvas()
+        videoCanvas.uid = uid
+        // the view to be binded
+        videoCanvas.view = nil
+        videoCanvas.renderMode = .hidden
+        agoraKit.setupRemoteVideo(videoCanvas)
+    }
+
+    /// Reports the statistics of the current call. The SDK triggers this callback once every two seconds after the user joins the channel.
+    /// @param stats stats struct
+    func rtcEngine(_ engine: AgoraRtcEngineKit, reportRtcStats stats: AgoraChannelStats) {
+        localVideo.statsInfo?.updateChannelStats(stats)
+    }
+
+    /// Reports the statistics of the uploading local audio streams once every two seconds.
+    /// @param stats stats struct
+    func rtcEngine(_ engine: AgoraRtcEngineKit, localAudioStats stats: AgoraRtcLocalAudioStats) {
+        localVideo.statsInfo?.updateLocalAudioStats(stats)
+    }
+}
diff --git a/iOS/APIExample/APIExample/Examples/Advanced/FaceCapture/zh-Hans.lproj/FaceCapture.strings b/iOS/APIExample/APIExample/Examples/Advanced/FaceCapture/zh-Hans.lproj/FaceCapture.strings
new file mode 100644
index 000000000..25a97ee8c
--- /dev/null
+++ b/iOS/APIExample/APIExample/Examples/Advanced/FaceCapture/zh-Hans.lproj/FaceCapture.strings
@@ -0,0 +1,21 @@
+
+/* Class = "UITextField"; placeholder = "Enter channel name"; ObjectID = "GWc-L5-fZV"; */
+"GWc-L5-fZV.placeholder" = "输入频道名";
+
+/* Class = "UINavigationItem"; title = "Join Channel"; ObjectID = "Iy0-Dq-h5x"; */
+"Iy0-Dq-h5x.title" = "加入频道";
+
+/* Class = "UIButton"; normalTitle = "Button"; ObjectID = "VpM-9W-auG"; */
+"VpM-9W-auG.normalTitle" = "Button";
+
+/* Class = "UIButton"; normalTitle = "Join"; ObjectID = "kbN-ZR-nNn"; */
+"kbN-ZR-nNn.normalTitle" = "加入频道";
+
+/* Class = "UIButton"; normalTitle = "Button"; ObjectID = "kf0-3f-UI5"; */
+"kf0-3f-UI5.normalTitle" = "Button";
+
+/* Class = "UIViewController"; title = "Join Channel Video"; ObjectID = "p70-sh-D1h"; */
+"p70-sh-D1h.title" = "视频实时通话";
+
+/* Class = "UIButton"; normalTitle = "Button"; ObjectID = "wHl-zh-dFe"; */
+"wHl-zh-dFe.normalTitle" = "Button";
diff --git a/iOS/APIExample/APIExample/Examples/Advanced/FusionCDN/FusionCDN.swift b/iOS/APIExample/APIExample/Examples/Advanced/FusionCDN/FusionCDN.swift
index 9cef9ac56..f266aa967 100644
--- a/iOS/APIExample/APIExample/Examples/Advanced/FusionCDN/FusionCDN.swift
+++ b/iOS/APIExample/APIExample/Examples/Advanced/FusionCDN/FusionCDN.swift
@@ -25,8 +25,7 @@ enum StreamingMode {
     }
 }
 
-class FusionCDNEntry : UIViewController
-{
+class FusionCDNEntry: UIViewController {
     @IBOutlet weak var joinButtonHost: AGButton!
     @IBOutlet weak var joinButtonAudience: AGButton!
@IBOutlet weak var channelTextField: AGTextField! @@ -34,23 +33,20 @@ class FusionCDNEntry : UIViewController let identifier = "FusionCDN" let hostView = "Host" let audienceView = "Audience" - var mode:StreamingMode = .agoraChannel + var mode: StreamingMode = .agoraChannel override func viewDidLoad() { super.viewDidLoad() modeBtn.setTitle("\(mode.description())", for: .normal) } - - func getStreamingMode(_ mode:StreamingMode) -> UIAlertAction { - return UIAlertAction(title: "\(mode.description())", style: .default, handler: {[unowned self] action in + func getStreamingMode(_ mode: StreamingMode) -> UIAlertAction { + return UIAlertAction(title: "\(mode.description())", style: .default, handler: { [unowned self] _ in switch mode { case .agoraChannel: channelTextField.placeholder = "Set Channel Name" - break case .cdnUrl: channelTextField.placeholder = "Set CDN URL" - break } self.mode = mode self.modeBtn.setTitle("\(mode.description())", for: .normal) @@ -66,28 +62,31 @@ class FusionCDNEntry : UIViewController } @IBAction func joinAsHost(sender: AGButton) { - guard let channelName = channelTextField.text else {return} - //resign channel text field + guard let channelName = channelTextField.text else { return } + // resign channel text field channelTextField.resignFirstResponder() - let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) // create new view controller every time to ensure we get a clean vc - guard let newViewController = storyBoard.instantiateViewController(withIdentifier: hostView) as? BaseViewController else {return} + guard let newViewController = storyBoard.instantiateViewController(withIdentifier: hostView) as? 
BaseViewController else { + return + } newViewController.title = channelName - newViewController.configs = ["channelName":channelName, "mode":mode] + newViewController.configs = ["channelName": channelName, "mode": mode] navigationController?.pushViewController(newViewController, animated: true) } @IBAction func joinAsAudience(sender: AGButton) { - guard let channelName = channelTextField.text else {return} - //resign channel text field + guard let channelName = channelTextField.text else { return } + // resign channel text field channelTextField.resignFirstResponder() let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) // create new view controller every time to ensure we get a clean vc - guard let newViewController = storyBoard.instantiateViewController(withIdentifier: audienceView) as? BaseViewController else {return} + guard let newViewController = storyBoard.instantiateViewController(withIdentifier: audienceView) as? BaseViewController else { + return + } newViewController.title = channelName - newViewController.configs = ["channelName":channelName, "mode":mode] + newViewController.configs = ["channelName": channelName, "mode": mode] navigationController?.pushViewController(newViewController, animated: true) } } @@ -104,7 +103,7 @@ class FusionCDNHost: BaseViewController { var cdnStreaming: Bool = false var rtcStreaming: Bool = false var transcoding = AgoraLiveTranscoding.default() - var videoViews: [UInt:VideoView] = [:] + var videoViews: [UInt: VideoView] = [:] var videoConfig: AgoraVideoEncoderConfiguration! 
let localUid = UInt.random(in: 1001...2000) @@ -117,7 +116,7 @@ class FusionCDNHost: BaseViewController { // set up agora instance when view loaded let config = AgoraRtcEngineConfig() config.appId = KeyCenter.AppId -// config.areaCode = GlobalSettings.shared.area + // config.areaCode = GlobalSettings.shared.area config.channelProfile = .liveBroadcasting agoraKit = AgoraRtcEngineKit.sharedEngine(with: config, delegate: self) // Configuring Privatization Parameters @@ -130,22 +129,23 @@ class FusionCDNHost: BaseViewController { agoraKit.enableVideo() agoraKit.enableAudio() - guard let resolution = GlobalSettings.shared.getSetting(key: "resolution")?.selectedOption().value as? CGSize, - let _ = GlobalSettings.shared.getSetting(key: "fps")?.selectedOption().value as? AgoraVideoFrameRate else {return} + guard let resolution = GlobalSettings.shared.getSetting(key: "resolution")? + .selectedOption().value as? CGSize else { + return + } WIDTH = Int(resolution.height > resolution.width ? resolution.width : resolution.height) HEIGHT = Int(resolution.height > resolution.width ? 
resolution.height : resolution.width) videoConfig = AgoraVideoEncoderConfiguration(size: resolution, frameRate: AgoraVideoFrameRate.fps15, - bitrate: AgoraVideoBitrateStandard, - orientationMode: .fixedPortrait, mirrorMode: .auto) + bitrate: AgoraVideoBitrateStandard, + orientationMode: .fixedPortrait, mirrorMode: .auto) agoraKit.setVideoEncoderConfiguration(videoConfig) agoraKit.setDirectCdnStreamingVideoConfiguration(videoConfig) agoraKit.setDirectCdnStreamingAudioConfiguration(.default) - transcoding.size = CGSize(width: WIDTH, height: HEIGHT); + transcoding.size = CGSize(width: WIDTH, height: HEIGHT) transcoding.videoFramerate = 15 - // set up local video to render your local camera preview let videoCanvas = AgoraRtcVideoCanvas() videoCanvas.uid = 0 @@ -166,22 +166,21 @@ class FusionCDNHost: BaseViewController { if mode == .agoraChannel { streamingUrl = "rtmp://push.webdemo.agoraio.cn/lbhd/\(channelName)" rtcSwitcher.isEnabled = false - } - else { + } else { streamingUrl = channelName rtcSwitcher.isHidden = true rtcSwitcherLabel.isHidden = true } } - @IBAction func onChangeRecordingVolume(_ sender:UISlider){ - let value:Int = Int(sender.value) + @IBAction func onChangeRecordingVolume(_ sender: UISlider) { + let value: Int = Int(sender.value) print("adjustRecordingSignalVolume \(value)") agoraKit.adjustRecordingSignalVolume(value) } @IBAction func setStreaming(sender: AGButton) { - if rtcStreaming{ + if rtcStreaming { stopRtcStreaming() } else if cdnStreaming { @@ -204,7 +203,7 @@ class FusionCDNHost: BaseViewController { streamingButton.setTitle("Streaming", for: .normal) streamingButton.setTitleColor(.gray, for: .normal) agoraKit.startPreview() - } else{ + } else { stopRskStreaming() resetUI() self.showAlert(title: "Error", message: "startDirectCdnStreaming failed: \(ret)") @@ -225,8 +224,8 @@ class FusionCDNHost: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: 
https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -256,7 +255,6 @@ class FusionCDNHost: BaseViewController { streamingButton.setTitleColor(.blue, for: .normal) } - @IBAction func setRtcStreaming(_ sender: UISwitch) { rtcStreaming = sender.isOn if rtcStreaming { @@ -269,7 +267,7 @@ class FusionCDNHost: BaseViewController { } func sortedViews() -> [VideoView] { - return Array(videoViews.values).sorted(by: { $0.uid < $1.uid }) + Array(videoViews.values).sorted(by: { $0.uid < $1.uid }) } func updateTranscodeLayout() { @@ -282,19 +280,16 @@ class FusionCDNHost: BaseViewController { user.rect = CGRect(x: WIDTH / 2, y: 0, width: WIDTH / 2, height: HEIGHT / 2) user.uid = view.uid self.transcoding.add(user) - break case 3: let user = AgoraLiveTranscodingUser() user.rect = CGRect(x: 0, y: HEIGHT / 2, width: WIDTH / 2, height: HEIGHT / 2) user.uid = view.uid self.transcoding.add(user) - break case 4: let user = AgoraLiveTranscodingUser() user.rect = CGRect(x: WIDTH / 2, y: HEIGHT / 2, width: WIDTH / 2, height: HEIGHT / 2) user.uid = view.uid self.transcoding.add(user) - break default: LogUtils.log(message: "igored user \(view.uid) as only 2x2 video layout supported in this demo.", level: .warning) } @@ -309,8 +304,7 @@ class FusionCDNHost: BaseViewController { agoraKit.disableAudio() agoraKit.disableVideo() stopRtcStreaming() - } - else if cdnStreaming { + } else if cdnStreaming { stopRskStreaming() resetUI() } @@ -326,7 +320,7 @@ struct CDNChannelInfo { extension CDNChannelInfo { /// static function to generate 4 channels based on given channel name - static func AllChannelList(_ 
num:Int32) -> [CDNChannelInfo] { + static func AllChannelList(_ num: Int32) -> [CDNChannelInfo] { var channels = [CDNChannelInfo]() for index in 0.. [VideoView] { - return Array(videoViews.values).sorted(by: { $0.uid < $1.uid }) + Array(videoViews.values).sorted(by: { $0.uid < $1.uid }) } @IBAction func setRtcStreaming(sender: UISwitch) { @@ -437,11 +428,10 @@ class FusionCDNAudience: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") - } - else{ + } else { // set up local video to render your local camera preview let videoCanvas = AgoraRtcVideoCanvas() videoCanvas.uid = 0 @@ -458,8 +448,7 @@ class FusionCDNAudience: BaseViewController { self.volumeSliderLabel.isHidden = false } }) - } - else { + } else { let leaveChannelOption = AgoraLeaveChannelOptions() leaveChannelOption.stopMicrophoneRecording = false agoraKit.leaveChannel(leaveChannelOption) { stats in @@ -476,8 +465,8 @@ class FusionCDNAudience: BaseViewController { } } - @IBAction func onChangeRecordingVolume(_ sender:UISlider){ - let value:Int = Int(sender.value) + @IBAction func onChangeRecordingVolume(_ sender: UISlider) { + let value: Int = Int(sender.value) print("adjustRecordingSignalVolume \(value)") agoraKit.adjustRecordingSignalVolume(value) } @@ -491,8 +480,8 @@ class FusionCDNAudience: BaseViewController { present(alert, animated: true, completion: nil) } - func getCDNChannel(_ channel:CDNChannelInfo) -> UIAlertAction { - return UIAlertAction(title: channel.channelName, style: .default, 
handler: {[unowned self] action in + func getCDNChannel(_ channel: CDNChannelInfo) -> UIAlertAction { + return UIAlertAction(title: channel.channelName, style: .default, handler: { [unowned self] _ in self.cdnSelector.setTitle(channel.channelName, for: .normal) let ret = mediaPlayerKit.switchAgoraCDNLine(by: channel.index) print(ret) @@ -503,40 +492,42 @@ class FusionCDNAudience: BaseViewController { super.viewDidDisappear(animated) agoraKit.disableVideo() agoraKit.disableAudio() - agoraKit.leaveChannel { (stats) -> Void in + agoraKit.leaveChannel { stats -> Void in LogUtils.log(message: "left channel, duration: \(stats.duration)", level: .info) } AgoraRtcEngineKit.destroy() } } - extension FusionCDNHost: AgoraDirectCdnStreamingEventDelegate { - func onDirectCdnStreamingStateChanged(_ state: AgoraDirectCdnStreamingState, error: AgoraDirectCdnStreamingError, message: String?) { + func onDirectCdnStreamingStateChanged(_ state: AgoraDirectCdnStreamingState, + reason: AgoraDirectCdnStreamingReason, + message: String?) 
{ DispatchQueue.main.async {[self] in - switch state{ + switch state { case .running: self.streamingButton.setTitle("Stop Streaming", for: .normal) self.streamingButton.setTitleColor(.red, for: .normal) cdnStreaming = true rtcSwitcher.isEnabled = true - break case .stopped: if rtcStreaming { // switch to rtc streaming when direct cdn streaming completely stopped switchToRtcStreaming() - } else{ + } else { self.streamingButton.setTitle("Start Live Streaming", for: .normal) self.streamingButton.setTitleColor(.blue, for: .normal) cdnStreaming = false } - break + case .failed: - self.showAlert(title: "Error", message: "Start Streaming failed, please go back to previous page and check the settings.") + self.showAlert(title: "Error", + message: "Start Streaming failed, please go back to previous page and check the settings.") default: - LogUtils.log(message: "onDirectCdnStreamingStateChanged: \(state.rawValue), \(error.rawValue), \(message!)", level: .info) + LogUtils.log(message: "onDirectCdnStreamingStateChanged: \(state.rawValue), \(reason.rawValue), \(message ?? 
"")", + level: .info) } } } @@ -598,12 +589,16 @@ extension FusionCDNHost: AgoraRtcEngineDelegate { self.container.reload(level: 0, animated: true) updateTranscodeLayout() } - - func rtcEngine(_ engine: AgoraRtcEngineKit, rtmpStreamingChangedToState url: String, state: AgoraRtmpStreamingState, errCode: AgoraRtmpStreamingErrorCode) { - LogUtils.log(message: "On rtmpStreamingChangedToState, state: \(state.rawValue), errCode: \(errCode.rawValue)", level: .info) + func rtcEngine(_ engine: AgoraRtcEngineKit, + rtmpStreamingChangedToState url: String, + state: AgoraRtmpStreamingState, + reason: AgoraRtmpStreamingReason) { + LogUtils.log(message: "On rtmpStreamingChangedToState, state: \(state.rawValue), errCode: \(reason.rawValue)", + level: .info) } - func rtcEngine(_ engine: AgoraRtcEngineKit, streamUnpublishedWithUrl url: String) { + func rtcEngine(_ engine: AgoraRtcEngineKit, + streamUnpublishedWithUrl url: String) { switchToRtcStreaming() // set up local video to render your local camera preview // let videoCanvas = AgoraRtcVideoCanvas() @@ -615,14 +610,17 @@ extension FusionCDNHost: AgoraRtcEngineDelegate { // videoViews.removeAll() // videoViews[0] = localVideo // agoraKit.setupLocalVideo(videoCanvas) - self.container.layoutStream(views: [videoViews[0]!.videoView]) + guard let view = videoViews[0] else { return } + self.container.layoutStream(views: [view.videoView]) } /// callback when a remote user is leaving the channel, note audience in live broadcast mode will NOT trigger this event /// @param uid uid of remote joined user /// @param reason reason why this user left, note this event may be triggered when the remote user /// become an audience in live broadcasting profile - func rtcEngine(_ engine: AgoraRtcEngineKit, didOfflineOfUid uid: UInt, reason: AgoraUserOfflineReason) { + func rtcEngine(_ engine: AgoraRtcEngineKit, + didOfflineOfUid uid: UInt, + reason: AgoraUserOfflineReason) { LogUtils.log(message: "remote user left: \(uid) reason \(reason)", level: 
.info) let videoCanvas = AgoraRtcVideoCanvas() @@ -632,7 +630,7 @@ extension FusionCDNHost: AgoraRtcEngineDelegate { videoCanvas.renderMode = .hidden agoraKit.setupRemoteVideo(videoCanvas) - //remove remote audio view + // remove remote audio view self.videoViews.removeValue(forKey: uid) self.container.layoutStream2x2(views: sortedViews()) self.container.reload(level: 0, animated: true) @@ -705,7 +703,7 @@ extension FusionCDNAudience: AgoraRtcEngineDelegate { videoCanvas.renderMode = .hidden agoraKit.setupRemoteVideo(videoCanvas) - //remove remote audio view + // remove remote audio view self.videoViews.removeValue(forKey: uid) self.container.layoutStream2x2(views: sortedViews()) self.container.reload(level: 0, animated: true) @@ -731,18 +729,20 @@ extension FusionCDNAudience: AgoraRtcEngineDelegate { } extension FusionCDNAudience: AgoraRtcMediaPlayerDelegate { - func AgoraRtcMediaPlayer(_ playerKit: AgoraRtcMediaPlayerProtocol, didChangedTo state: AgoraMediaPlayerState, error: AgoraMediaPlayerError) { - LogUtils.log(message: "player rtc channel publish helper state changed to: \(state.rawValue), error: \(error.rawValue)", level: .info) + func AgoraRtcMediaPlayer(_ playerKit: AgoraRtcMediaPlayerProtocol, + didChangedTo state: AgoraMediaPlayerState, + reason: AgoraMediaPlayerReason) { + LogUtils.log(message: "player rtc channel publish helper state changed to: \(state.rawValue), error: \(reason.rawValue)", level: .info) DispatchQueue.main.async {[weak self] in guard let weakself = self else { return } switch state { case .failed: - weakself.showAlert(message: "media player error: \(error.rawValue)") - break + weakself.showAlert(message: "media player error: \(reason.rawValue)") + case .openCompleted: weakself.mediaPlayerKit.play() guard let mode = weakself.configs["mode"] as? 
StreamingMode else {return} - if (mode == .agoraChannel){ + if mode == .agoraChannel { let num = weakself.mediaPlayerKit.getAgoraCDNLineCount() if num > 0 { weakself.channelNumber = num @@ -752,29 +752,27 @@ extension FusionCDNAudience: AgoraRtcMediaPlayerDelegate { } weakself.rtcSwitcher.isEnabled = true } - break - case .stopped: - break + case .stopped: break default: break } } } - func AgoraRtcMediaPlayer(_ playerKit: AgoraRtcMediaPlayerProtocol, didOccur event: AgoraMediaPlayerEvent, elapsedTime time: Int, message: String?) { - DispatchQueue.main.async {[weak self] in + func AgoraRtcMediaPlayer(_ playerKit: AgoraRtcMediaPlayerProtocol, + didOccur event: AgoraMediaPlayerEvent, + elapsedTime time: Int, + message: String?) { + DispatchQueue.main.async { [weak self] in guard let weakself = self else { return } - switch event{ + switch event { case .switchError: weakself.showAlert(message: "switch cdn channel error!: \(message ?? "")") - break + case .switchComplete: weakself.showAlert(message: "switch cdn channel complete!") - break default: break - } } } } - diff --git a/iOS/APIExample/APIExample/Examples/Advanced/JoinMultiChannel/JoinMultiChannel.swift b/iOS/APIExample/APIExample/Examples/Advanced/JoinMultiChannel/JoinMultiChannel.swift index eb744c38f..5aef7ed0f 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/JoinMultiChannel/JoinMultiChannel.swift +++ b/iOS/APIExample/APIExample/Examples/Advanced/JoinMultiChannel/JoinMultiChannel.swift @@ -9,8 +9,7 @@ import UIKit import AGEVideoLayout import AgoraRtcKit -class JoinMultiChannelEntry : UIViewController -{ +class JoinMultiChannelEntry: UIViewController { @IBOutlet weak var joinButton: AGButton! @IBOutlet weak var channelTextField: AGTextField! 
let identifier = "JoinMultiChannel" @@ -21,14 +20,16 @@ class JoinMultiChannelEntry : UIViewController @IBAction func doJoinPressed(sender: AGButton) { guard let channelName = channelTextField.text else {return} - //resign channel text field + // resign channel text field channelTextField.resignFirstResponder() let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) // create new view controller every time to ensure we get a clean vc - guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else {return} + guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else { + return + } newViewController.title = channelName - newViewController.configs = ["channelName":channelName] + newViewController.configs = ["channelName": channelName] self.navigationController?.pushViewController(newViewController, animated: true) } } @@ -77,7 +78,6 @@ class JoinMultiChannelMain: BaseViewController { Util.configPrivatization(agoraKit: agoraKit) agoraKit.setLogFile(LogUtils.sdkLogPath()) - // layout render view localVideo.setPlaceholder(text: "Local Host".localized) channel1RemoteVideo.setPlaceholder(text: "\(channelName1 )\nRemote Host") @@ -90,7 +90,8 @@ class JoinMultiChannelMain: BaseViewController { agoraKit.enableAudio() let resolution = (GlobalSettings.shared.getSetting(key: "resolution")?.selectedOption().value as? CGSize) ?? .zero let fps = (GlobalSettings.shared.getSetting(key: "fps")?.selectedOption().value as? AgoraVideoFrameRate) ?? .fps15 - let orientation = (GlobalSettings.shared.getSetting(key: "orientation")?.selectedOption().value as? AgoraVideoOutputOrientationMode) ?? .fixedPortrait + let orientation = (GlobalSettings.shared.getSetting(key: "orientation")? + .selectedOption().value as? AgoraVideoOutputOrientationMode) ?? 
.fixedPortrait agoraKit.setVideoEncoderConfiguration(AgoraVideoEncoderConfiguration(size: resolution, frameRate: fps, bitrate: AgoraVideoBitrateStandard, @@ -113,8 +114,13 @@ class JoinMultiChannelMain: BaseViewController { // you have to call startPreview to see local video agoraKit.startPreview() + joinChannel1() + joinChannel2() + } + + private func joinChannel1() { // join channel1 - var mediaOptions = AgoraRtcChannelMediaOptions() + let mediaOptions = AgoraRtcChannelMediaOptions() // publish audio and camera track for channel 1 mediaOptions.publishCameraTrack = false mediaOptions.publishMicrophoneTrack = false @@ -127,20 +133,19 @@ class JoinMultiChannelMain: BaseViewController { uid: CONNECTION_1_UID, mediaOptions: mediaOptions, joinSuccess: nil) - -// self.agoraKit.setExternalAudioSource(true, sampleRate: 44100, channels: 2, sourceNumber: 3, localPlayback: false, publish: true) if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel1 call failed: \(result), please check your params") } } - - + } + + private func joinChannel2() { // join channel2 - mediaOptions = AgoraRtcChannelMediaOptions() + let mediaOptions = AgoraRtcChannelMediaOptions() mediaOptions.publishMicrophoneTrack = true mediaOptions.publishCameraTrack = true mediaOptions.autoSubscribeVideo = true @@ -158,8 +163,8 @@ class JoinMultiChannelMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - 
// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel2 call failed: \(result), please check your params") } } @@ -193,6 +198,7 @@ class JoinMultiChannelMain: BaseViewController { group.notify(queue: .main) { [weak self] in self?.agoraKit.leaveChannelEx(connection2, options: channelOptions, leaveChannelBlock: nil) + self?.channel2.remoteUid = 0 } } @@ -214,7 +220,7 @@ class JoinMultiChannelMain: BaseViewController { class Channel1Delegate: NSObject, AgoraRtcEngineDelegate { var channelId: String? - var view:VideoView? + var view: VideoView? func rtcEngine(_ engine: AgoraRtcEngineKit, didJoinChannel channel: String, withUid uid: UInt, elapsed: Int) { @@ -261,15 +267,15 @@ class Channel1Delegate: NSObject, AgoraRtcEngineDelegate { engine.setupRemoteVideoEx(videoCanvas, connection: connection) } - func rtcEngine(_ engine: AgoraRtcEngineKit, localAudioStateChanged state: AgoraAudioLocalState, error: AgoraAudioLocalError) { + func rtcEngine(_ engine: AgoraRtcEngineKit, localAudioStateChanged state: AgoraAudioLocalState, reason: AgoraAudioLocalReason) { print("localAudioStateChanged == \(state.rawValue)") } } /// agora rtc engine delegate events class Channel2Delegate: NSObject, AgoraRtcEngineDelegate { - var channelId:String? - var view:VideoView? + var channelId: String? + var view: VideoView? 
var remoteUid: UInt = 0 func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurWarning warningCode: AgoraWarningCode) { @@ -326,7 +332,7 @@ class Channel2Delegate: NSObject, AgoraRtcEngineDelegate { remoteUid = 0 } - func rtcEngine(_ engine: AgoraRtcEngineKit, localAudioStateChanged state: AgoraAudioLocalState, error: AgoraAudioLocalError) { + func rtcEngine(_ engine: AgoraRtcEngineKit, localAudioStateChanged state: AgoraAudioLocalState, reason: AgoraAudioLocalReason) { print("localAudioStateChanged == \(state.rawValue)") } } diff --git a/iOS/APIExample/APIExample/Examples/Advanced/KtvCopyrightMusic/KtvCopyrightMusic.swift b/iOS/APIExample/APIExample/Examples/Advanced/KtvCopyrightMusic/KtvCopyrightMusic.swift index dbde58772..2908e9d80 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/KtvCopyrightMusic/KtvCopyrightMusic.swift +++ b/iOS/APIExample/APIExample/Examples/Advanced/KtvCopyrightMusic/KtvCopyrightMusic.swift @@ -15,7 +15,6 @@ class KtvCopyrightMusic: UIViewController { override func viewDidLoad() { super.viewDidLoad() - } @IBAction func onTapKtvCopyrightButton(_ sender: Any) { guard let url = URL(string: urlString) else { return } diff --git a/iOS/APIExample/APIExample/Examples/Advanced/LiveStreaming/LiveStreaming.swift b/iOS/APIExample/APIExample/Examples/Advanced/LiveStreaming/LiveStreaming.swift index 0d3f034a2..ce0e34b17 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/LiveStreaming/LiveStreaming.swift +++ b/iOS/APIExample/APIExample/Examples/Advanced/LiveStreaming/LiveStreaming.swift @@ -9,13 +9,12 @@ import UIKit import AGEVideoLayout import AgoraRtcKit -class LiveStreamingEntry : UIViewController -{ +class LiveStreamingEntry: UIViewController { @IBOutlet weak var joinButton: UIButton! @IBOutlet weak var preloadButton: UIButton! @IBOutlet weak var channelTextField: UITextField! 
let identifier = "LiveStreaming" - var role:AgoraClientRole = .broadcaster + var role: AgoraClientRole = .broadcaster private var isFirstFrame: Bool = false private var backgroundColor: UInt32 = 0x000000 @@ -25,8 +24,8 @@ class LiveStreamingEntry : UIViewController preloadButton.setTitle("cancel preload".localized, for: .selected) } - func getRoleAction(_ role: AgoraClientRole) -> UIAlertAction{ - return UIAlertAction(title: "\(role.description())", style: .default, handler: {[unowned self] action in + func getRoleAction(_ role: AgoraClientRole) -> UIAlertAction { + return UIAlertAction(title: "\(role.description())", style: .default, handler: { [unowned self] _ in self.role = role self.doJoin() }) @@ -42,7 +41,10 @@ class LiveStreamingEntry : UIViewController @IBAction func doOptimizeFirstFrameSwitch(_ sender: UISwitch) { if sender.isOn { - let alertVC = UIAlertController(title: "After this function is enabled, it cannot be disabled and takes effect only when both the primary and secondary ends are enabled".localized, + // swiftlint:disable line_length + let title = "After this function is enabled, it cannot be disabled and takes effect only when both the primary and secondary ends are enabled".localized + // swiftlint:enable line_length + let alertVC = UIAlertController(title: title, message: nil, preferredStyle: .alert) @@ -77,12 +79,14 @@ class LiveStreamingEntry : UIViewController } @IBAction func doJoinPressed(sender: UIButton) { - guard let _ = channelTextField.text else {return} - //resign channel text field + // resign channel text field channelTextField.resignFirstResponder() - //display role picker - let alert = UIAlertController(title: "Pick Role".localized, message: nil, preferredStyle: UIDevice.current.userInterfaceIdiom == .pad ? UIAlertController.Style.alert : UIAlertController.Style.actionSheet) + // display role picker + let style: UIAlertController.Style = UIDevice.current.userInterfaceIdiom == .pad ? 
.alert : .actionSheet + let alert = UIAlertController(title: "Pick Role".localized, + message: nil, + preferredStyle: style) alert.addAction(getRoleAction(.broadcaster)) alert.addAction(getRoleAction(.audience)) alert.addCancelAction() @@ -90,13 +94,14 @@ class LiveStreamingEntry : UIViewController } func doJoin() { - guard let channelName = channelTextField.text else {return} + guard let channelName = channelTextField.text else { return } let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) // create new view controller every time to ensure we get a clean vc - guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else {return} + guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else { return + } newViewController.title = channelName - newViewController.configs = ["channelName":channelName, - "role":self.role, + newViewController.configs = ["channelName": channelName, + "role": self.role, "isFirstFrame": isFirstFrame, "isPreloadChannel": preloadButton.isSelected, "backgroundColor": backgroundColor] @@ -107,12 +112,12 @@ class LiveStreamingEntry : UIViewController class LiveStreamingMain: BaseViewController { var foregroundVideo = Bundle.loadVideoView(type: .local, audioOnly: false) var backgroundVideo = Bundle.loadVideoView(type: .remote, audioOnly: false) - @IBOutlet weak var foregroundVideoContainer:UIView! - @IBOutlet weak var backgroundVideoContainer:UIView! - @IBOutlet weak var clientRoleToggleView:UIView! - @IBOutlet weak var ultraLowLatencyToggleView:UIView! - @IBOutlet weak var clientRoleToggle:UISwitch! - @IBOutlet weak var ultraLowLatencyToggle:UISwitch! + @IBOutlet weak var foregroundVideoContainer: UIView! + @IBOutlet weak var backgroundVideoContainer: UIView! + @IBOutlet weak var clientRoleToggleView: UIView! + @IBOutlet weak var ultraLowLatencyToggleView: UIView! 
+ @IBOutlet weak var clientRoleToggle: UISwitch! + @IBOutlet weak var ultraLowLatencyToggle: UISwitch! @IBOutlet weak var takeSnapshot: UIButton! @IBOutlet weak var watarMarkContainer: UIView! @IBOutlet weak var dualStreamContainer: UIView! @@ -121,7 +126,6 @@ class LiveStreamingMain: BaseViewController { @IBOutlet weak var codingSegment: UISegmentedControl! @IBOutlet weak var videoImageContainer: UIView! - var remoteUid: UInt? { didSet { foregroundVideoContainer.isHidden = !(role == .broadcaster && remoteUid != nil) @@ -216,8 +220,8 @@ class LiveStreamingMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -302,7 +306,6 @@ class LiveStreamingMain: BaseViewController { agoraKit.setVideoEncoderConfiguration(encoderConfig) } - // setup watermark @IBAction func onTapWatermarkSwitch(_ sender: UISwitch) { if sender.isOn { @@ -320,7 +323,7 @@ class LiveStreamingMain: BaseViewController { } } @IBAction func onTapDualStreamSwitch(_ sender: UISwitch) { - agoraKit.enableDualStreamMode(sender.isOn) + agoraKit.setDualStreamMode(sender.isOn ? .enableSimulcastStream : .disableSimulcastStream) dualStreamTipsLabel.text = sender.isOn ? 
"Enabled" : "Default: high stream" } @@ -333,7 +336,7 @@ agoraKit.takeSnapshot(Int(remoteUid), filePath: path) showAlert(title: "Screenshot successful".localized, message: path) } - @IBAction func onTapForegroundVideo(_ sender:UIGestureRecognizer) { + @IBAction func onTapForegroundVideo(_ sender: UIGestureRecognizer) { isLocalVideoForeground = !isLocalVideoForeground let localVideoCanvas = AgoraRtcVideoCanvas() localVideoCanvas.uid = 0 @@ -351,14 +354,14 @@ class LiveStreamingMain: BaseViewController { } } - @IBAction func onToggleClientRole(_ sender:UISwitch) { - let role:AgoraClientRole = sender.isOn ? .broadcaster : .audience + @IBAction func onToggleClientRole(_ sender: UISwitch) { + let role: AgoraClientRole = sender.isOn ? .broadcaster : .audience updateClientRole(role) } - fileprivate func updateClientRole(_ role:AgoraClientRole) { + fileprivate func updateClientRole(_ role: AgoraClientRole) { self.role = role - if(role == .broadcaster) { + if role == .broadcaster { becomeBroadcaster() } else { becomeAudience() @@ -369,12 +372,12 @@ agoraKit.updateChannel(with: option) } - @IBAction func onToggleUltraLowLatency(_ sender:UISwitch) { + @IBAction func onToggleUltraLowLatency(_ sender: UISwitch) { updateUltraLowLatency(sender.isOn) } - fileprivate func updateUltraLowLatency(_ enabled:Bool) { - if(self.role == .audience) { + fileprivate func updateUltraLowLatency(_ enabled: Bool) { + if self.role == .audience { self.isUltraLowLatencyOn = enabled updateClientRole(.audience) } @@ -410,8 +413,8 @@ extension LiveStreamingMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: 
https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) @@ -435,7 +438,7 @@ extension LiveStreamingMain: AgoraRtcEngineDelegate { backgroundVideo.statsInfo?.updateRemoteUid(remoteUid: uid) LogUtils.log(message: "remote user join: \(uid) \(elapsed)ms", level: .info) - //record remote uid + // record remote uid remoteUid = uid // Only one remote video view is available for this // tutorial. Here we check if there exists a surface @@ -459,8 +462,8 @@ extension LiveStreamingMain: AgoraRtcEngineDelegate { /// become an audience in live broadcasting profile func rtcEngine(_ engine: AgoraRtcEngineKit, didOfflineOfUid uid: UInt, reason: AgoraUserOfflineReason) { LogUtils.log(message: "remote user left: \(uid) reason \(reason)", level: .info) - //clear remote uid - if(remoteUid == uid){ + // clear remote uid + if remoteUid == uid { remoteUid = nil } @@ -499,7 +502,10 @@ extension LiveStreamingMain: AgoraRtcEngineDelegate { backgroundVideo.statsInfo?.updateAudioStats(stats) } - func rtcEngine(_ engine: AgoraRtcEngineKit, videoRenderingTracingResultOfUid uid: UInt, currentEvent: AgoraMediaTraceEvent, tracingInfo: AgoraVideoRenderingTracingInfo) { + func rtcEngine(_ engine: AgoraRtcEngineKit, + videoRenderingTracingResultOfUid uid: UInt, + currentEvent: AgoraMediaTraceEvent, + tracingInfo: AgoraVideoRenderingTracingInfo) { backgroundVideo.statsInfo?.updateFirstFrameInfo(tracingInfo) } } diff --git a/iOS/APIExample/APIExample/Examples/Advanced/MediaChannelRelay/MediaChannelRelay.swift b/iOS/APIExample/APIExample/Examples/Advanced/MediaChannelRelay/MediaChannelRelay.swift index a8c356b74..e22ddaf0f 100644 --- 
a/iOS/APIExample/APIExample/Examples/Advanced/MediaChannelRelay/MediaChannelRelay.swift +++ b/iOS/APIExample/APIExample/Examples/Advanced/MediaChannelRelay/MediaChannelRelay.swift @@ -9,8 +9,7 @@ import UIKit import AGEVideoLayout import AgoraRtcKit -class MediaChannelRelayEntry : UIViewController -{ +class MediaChannelRelayEntry: UIViewController { @IBOutlet weak var joinButton: UIButton! @IBOutlet weak var channelTextField: UITextField! let identifier = "MediaChannelRelay" @@ -20,15 +19,17 @@ class MediaChannelRelayEntry : UIViewController } @IBAction func doJoinPressed(sender: UIButton) { - guard let channelName = channelTextField.text else {return} - //resign channel text field + guard let channelName = channelTextField.text else { return } + // resign channel text field channelTextField.resignFirstResponder() let storyBoard: UIStoryboard = UIStoryboard(name: identifier, bundle: nil) // create new view controller every time to ensure we get a clean vc - guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else {return} + guard let newViewController = storyBoard.instantiateViewController(withIdentifier: identifier) as? BaseViewController else { + return + } newViewController.title = channelName - newViewController.configs = ["channelName":channelName] + newViewController.configs = ["channelName": channelName] navigationController?.pushViewController(newViewController, animated: true) } } @@ -78,7 +79,6 @@ class MediaChannelRelayMain: BaseViewController { // get channel name from configs guard let channelName = configs["channelName"] as? String else {return} - // make myself a broadcaster agoraKit.setClientRole(GlobalSettings.shared.getUserRole()) @@ -88,7 +88,8 @@ class MediaChannelRelayMain: BaseViewController { let resolution = (GlobalSettings.shared.getSetting(key: "resolution")?.selectedOption().value as? CGSize) ?? 
.zero let fps = (GlobalSettings.shared.getSetting(key: "fps")?.selectedOption().value as? AgoraVideoFrameRate) ?? .fps15 - let orientation = (GlobalSettings.shared.getSetting(key: "orientation")?.selectedOption().value as? AgoraVideoOutputOrientationMode) ?? .fixedPortrait + let orientation = (GlobalSettings.shared.getSetting(key: "orientation")? + .selectedOption().value as? AgoraVideoOutputOrientationMode) ?? .fixedPortrait agoraKit.setVideoEncoderConfiguration(AgoraVideoEncoderConfiguration(size: resolution, frameRate: fps, bitrate: AgoraVideoBitrateStandard, @@ -123,8 +124,8 @@ class MediaChannelRelayMain: BaseViewController { if result != 0 { // Usually happens with invalid parameters // Error code description can be found at: - // en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - // cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + // en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + // cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code self.showAlert(title: "Error", message: "joinChannel call failed: \(result), please check your params") } }) @@ -132,10 +133,10 @@ class MediaChannelRelayMain: BaseViewController { /// start relay @IBAction func doRelay(_ sender: UIButton) { - guard let destinationChannelName = relayChannelField.text else {return} + guard let destinationChannelName = relayChannelField.text else { return } // prevent operation if target channel name is empty - if(destinationChannelName.isEmpty) { + if destinationChannelName.isEmpty { self.showAlert(message: "Destination channel name is empty") return } @@ -147,7 +148,7 @@ class MediaChannelRelayMain: BaseViewController { // configure target channel info let destinationInfo = AgoraChannelMediaRelayInfo(token: nil) config.setDestinationInfo(destinationInfo, forChannelName: destinationChannelName) - agoraKit.startChannelMediaRelay(config) + 
agoraKit.startOrUpdateChannelMediaRelay(config) } /// stop relay @@ -194,8 +195,8 @@ extension MediaChannelRelayMain: AgoraRtcEngineDelegate { /// callback when error occured for agora sdk, you are recommended to display the error descriptions on demand /// to let user know something wrong is happening /// Error code description can be found at: - /// en: https://api-ref.agora.io/en/voice-sdk/macos/3.x/Constants/AgoraErrorCode.html#content - /// cn: https://docs.agora.io/cn/Voice/API%20Reference/oc/Constants/AgoraErrorCode.html + /// en: https://api-ref.agora.io/en/video-sdk/ios/4.x/documentation/agorartckit/agoraerrorcode + /// cn: https://doc.shengwang.cn/api-ref/rtc/ios/error-code /// @param errorCode error code of the problem func rtcEngine(_ engine: AgoraRtcEngineKit, didOccurError errorCode: AgoraErrorCode) { LogUtils.log(message: "error: \(errorCode)", level: .error) @@ -240,21 +241,23 @@ extension MediaChannelRelayMain: AgoraRtcEngineDelegate { /// callback when a media relay process state changed /// @param state state of media relay /// @param error error details if media relay reaches failure state - func rtcEngine(_ engine: AgoraRtcEngineKit, channelMediaRelayStateDidChange state: AgoraChannelMediaRelayState, error: AgoraChannelMediaRelayError) { + func rtcEngine(_ engine: AgoraRtcEngineKit, + channelMediaRelayStateDidChange state: AgoraChannelMediaRelayState, + error: AgoraChannelMediaRelayError) { LogUtils.log(message: "channelMediaRelayStateDidChange: \(state.rawValue) error \(error.rawValue)", level: .info) - switch(state){ + switch state { case .running: isRelaying = true - break + case .failure: showAlert(message: "Media Relay Failed: \(error.rawValue)") isRelaying = false - break + case .idle: isRelaying = false - break - default:break + + default: break } } diff --git a/iOS/APIExample/APIExample/Examples/Advanced/MediaPlayer/Base.lproj/MediaPlayer.storyboard 
b/iOS/APIExample/APIExample/Examples/Advanced/MediaPlayer/Base.lproj/MediaPlayer.storyboard index f6f71c1c1..82bad546a 100644 --- a/iOS/APIExample/APIExample/Examples/Advanced/MediaPlayer/Base.lproj/MediaPlayer.storyboard +++ b/iOS/APIExample/APIExample/Examples/Advanced/MediaPlayer/Base.lproj/MediaPlayer.storyboard [storyboard XML hunks at @@ -1,9 +1,9 @@ and @@ -103,7 +103,7 @@: element markup stripped during extraction, content unrecoverable]
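Taken together, the Swift hunks in this patch apply one recurring 4.3.0 migration: several renamed entry points (`enableDualStreamMode` to `setDualStreamMode`, `startChannelMediaRelay` to `startOrUpdateChannelMediaRelay`) and a family of delegate callbacks whose trailing `error:` parameter became `reason:` with matching enum renames. A minimal sketch of the migrated call sites follows; every symbol name is taken from the diff above, but the surrounding setup (engine creation, join, relay configuration) is assumed context, not a complete example.

```swift
import AgoraRtcKit

// Sketch of the 4.3.0 call-site migration shown in the hunks above.
class MigrationSketch: NSObject, AgoraRtcEngineDelegate {
    var agoraKit: AgoraRtcEngineKit!

    func applyRenamedAPIs(relayConfig: AgoraChannelMediaRelayConfiguration) {
        // 4.2.x: agoraKit.enableDualStreamMode(true)
        agoraKit.setDualStreamMode(.enableSimulcastStream)
        // 4.2.x: agoraKit.startChannelMediaRelay(relayConfig)
        agoraKit.startOrUpdateChannelMediaRelay(relayConfig)
    }

    // Delegate callbacks whose trailing `error:` parameter became `reason:`,
    // e.g. AgoraAudioLocalError -> AgoraAudioLocalReason. A method kept with
    // the old signature no longer matches the protocol requirement and would
    // typically never be called, so each override adopts the new spelling.
    func rtcEngine(_ engine: AgoraRtcEngineKit,
                   localAudioStateChanged state: AgoraAudioLocalState,
                   reason: AgoraAudioLocalReason) {
        print("local audio state \(state.rawValue), reason \(reason.rawValue)")
    }
}
```

The same rename pattern covers the other callbacks touched by the patch (`onDirectCdnStreamingStateChanged`, `rtmpStreamingChangedToState`, and the media player `didChangedTo` state callback), so porting an app to 4.3.0 is mostly a mechanical sweep over delegate signatures.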