Describe your issue. If applicable, add screenshots to help explain your problem.
I'm facing an issue in my Flutter app related to using camera streams for real-time face detection. The app works fine on the emulator, but on a physical device, it freezes and eventually crashes when handling the continuous image stream from the camera. I've already configured camera orientation lock and added delays to avoid overload, but the problem persists. Has anyone encountered this issue before and could help me identify and resolve the cause of these crashes when dealing with camera streams on a real device?
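The frame-dropping pattern described above (a busy flag plus a minimum interval between detections) can be factored into a small pure-Dart helper. This is an illustrative sketch, not part of the reported code; the `FrameThrottler` name and its methods are hypothetical:

```dart
/// Illustrative helper combining a busy flag with a minimum interval,
/// mirroring the frame-throttling logic described above.
class FrameThrottler {
  FrameThrottler(this.interval);

  final Duration interval;
  DateTime _lastRun = DateTime.fromMillisecondsSinceEpoch(0);
  bool _busy = false;

  /// Returns true if the caller may process this frame. Callers must call
  /// [release] when processing finishes, or later frames are dropped forever.
  bool tryAcquire(DateTime now) {
    if (_busy || now.difference(_lastRun) < interval) return false;
    _busy = true;
    _lastRun = now;
    return true;
  }

  void release() => _busy = false;
}
```

Forgetting to reset the busy flag on every exit path (including early returns) is a common way for detection to silently stop after the first frame.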
Code sample
original code
import 'dart:async';
import 'dart:io';
import 'dart:math';

import 'package:camera/camera.dart';
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
import 'package:flutter_easyloading/flutter_easyloading.dart';
import 'package:fura_fila/utils/colors.dart';
import 'package:fura_fila/views/widgets/base.dart';
import 'package:google_mlkit_face_detection/google_mlkit_face_detection.dart';

class FaceDetectionScreen extends StatefulWidget {
  final List<CameraDescription> cameras;

  const FaceDetectionScreen({
    Key? key,
    required this.cameras,
  }) : super(key: key);

  @override
  _FaceDetectionScreenState createState() => _FaceDetectionScreenState();
}
class _FaceDetectionScreenState extends State<FaceDetectionScreen>
    with WidgetsBindingObserver {
  late CameraController _cameraController;
  late Future<void> _initializeControllerFuture;
  late final FaceDetector _faceDetector;
  bool _isDetecting = false;
  int _detectingCounter = 0;
  // "Position your face in the camera"
  String _feedbackMessage = 'Posicione seu rosto na câmera';
  final Duration _detectionInterval = const Duration(milliseconds: 500);
  DateTime _lastDetectionTime = DateTime.now();

  @override
  void initState() {
    super.initState();
    _faceDetector = FaceDetector(
      options: FaceDetectorOptions(
        enableClassification: true,
        // Landmarks must be enabled explicitly; without this the
        // FaceLandmarkType lookups below always return null.
        enableLandmarks: true,
        minFaceSize: 0.1,
      ),
    );
    WidgetsBinding.instance.addObserver(this);
    _initializeControllerFuture = _initializeCamera();
  }

  @override
  void dispose() {
    WidgetsBinding.instance.removeObserver(this);
    _cameraController.dispose();
    _faceDetector.close();
    super.dispose();
  }
  Future<void> _initializeCamera() async {
    try {
      _cameraController = CameraController(
        // Use the second camera (usually the front one) when available;
        // fall back to the first on single-camera devices.
        widget.cameras.length > 1 ? widget.cameras[1] : widget.cameras.first,
        ResolutionPreset.low,
        enableAudio: false,
      );
      await _cameraController.initialize();
      await _cameraController
          .lockCaptureOrientation(DeviceOrientation.portraitUp);
      if (_cameraController.value.isInitialized) {
        _cameraController.startImageStream((CameraImage image) {
          // Drop frames while a previous frame is still being processed.
          if (!_isDetecting) {
            _isDetecting = true;
            _detectFaces(image);
          }
        });
      }
      setState(() {});
    } catch (e) {
      // "Error initializing the camera"
      print('Erro ao inicializar a câmera: $e');
    }
  }
  Future<void> _detectFaces(CameraImage image) async {
    // Throttle: if this frame arrives before the detection interval has
    // elapsed, release the busy flag and wait for a later frame. (Checking
    // `_isDetecting` here as well would always return early, because the
    // stream callback has already set it.)
    if (DateTime.now().difference(_lastDetectionTime) < _detectionInterval) {
      _isDetecting = false;
      return;
    }
    _isDetecting = true;
    _lastDetectionTime = DateTime.now();
    try {
      final inputImage = _convertCameraImageToInputImage(image);
      final List<Face> faces = await _faceDetector.processImage(inputImage);
      setState(() {
        if (faces.isNotEmpty) {
          final face = faces[0];
          if (!_isBackgroundWhite(image)) {
            // "Make sure the background is white."
            _feedbackMessage = 'Certifique-se de que o fundo é branco.';
            return;
          }
          if (_isFaceTooCloseOrFar(face)) {
            // "Please adjust the distance of your face."
            _feedbackMessage = 'Por favor, ajuste a distância do rosto.';
            return;
          }
          if (_isFaceCovered(face)) {
            // "Make sure the face is not covered."
            _feedbackMessage = 'Certifique-se de que o rosto não está coberto.';
            return;
          }
          if (_isHeadCovered(face)) {
            // "Something is covering the head."
            _feedbackMessage = 'Algo está cobrindo a cabeça.';
            return;
          }
          // "Face detected correctly."
          _feedbackMessage = 'Rosto detectado corretamente.';
        } else {
          // "No face detected."
          _feedbackMessage = 'Nenhum rosto detectado.';
        }
      });
    } catch (e) {
      // "Error detecting faces"
      print('Erro ao detectar rostos: $e');
    } finally {
      _isDetecting = false;
    }
  }
  bool _isFaceTooCloseOrFar(Face face) {
    // Rough distance heuristic based on the bounding-box width in pixels.
    final faceWidth = face.boundingBox.width;
    return faceWidth < 100 || faceWidth > 300;
  }

  bool _isBackgroundWhite(CameraImage image) {
    // NOTE: on Android the camera stream delivers YUV data, so planes[0] is
    // the luminance (Y) plane; this loop treats it as interleaved RGBA, which
    // only matches BGRA-style formats (e.g. the iOS default).
    int whitePixelCount = 0;
    int totalPixelCount = 0;
    final bytes = image.planes[0].bytes;
    for (int i = 0; i + 2 < bytes.length; i += 4) {
      final r = bytes[i];
      final g = bytes[i + 1];
      final b = bytes[i + 2];
      if (r > 200 && g > 200 && b > 200) {
        whitePixelCount++;
      }
      totalPixelCount++;
    }
    final double whitePercentage = (whitePixelCount / totalPixelCount) * 100;
    return whitePercentage > 50;
  }

  bool _isFaceCovered(Face face) {
    // Treat the face as covered when key landmarks are missing.
    final leftEye = face.landmarks[FaceLandmarkType.leftEye];
    final rightEye = face.landmarks[FaceLandmarkType.rightEye];
    final noseBase = face.landmarks[FaceLandmarkType.noseBase];
    return leftEye == null || rightEye == null || noseBase == null;
  }

  bool _isHeadCovered(Face face) {
    // ML Kit has no forehead landmark; noseBase is used as a stand-in here.
    final forehead = face.landmarks[FaceLandmarkType.noseBase];
    return forehead == null;
  }
  InputImage _convertCameraImageToInputImage(CameraImage image) {
    // WARNING: simply concatenating the YUV_420_888 planes ignores row and
    // pixel strides and does not generally produce a valid NV21 buffer, even
    // though the metadata below declares nv21. Emulators are often tolerant
    // of this; physical devices frequently are not.
    final WriteBuffer allBytes = WriteBuffer();
    for (final Plane plane in image.planes) {
      allBytes.putUint8List(plane.bytes);
    }
    final bytes = allBytes.done().buffer.asUint8List();

    // Hard-coded rotation based on lens direction (capture orientation is
    // locked to portraitUp in _initializeCamera).
    final InputImageRotation rotation =
        _cameraController.description.lensDirection == CameraLensDirection.front
            ? InputImageRotation.rotation270deg
            : InputImageRotation.rotation90deg;

    return InputImage.fromBytes(
      bytes: bytes,
      metadata: InputImageMetadata(
        size: Size(image.width.toDouble(), image.height.toDouble()),
        rotation: rotation,
        format: InputImageFormat.nv21,
        bytesPerRow: image.planes.first.bytesPerRow,
      ),
    );
  }
  @override
  Widget build(BuildContext context) {
    return Base(
      bottomButton: ElevatedButton(
        style: ElevatedButton.styleFrom(
          backgroundColor: AppColors.brownDark,
          padding: const EdgeInsets.symmetric(vertical: 16),
          shape: RoundedRectangleBorder(
            borderRadius: BorderRadius.circular(8),
          ),
        ),
        onPressed: () async {
          try {
            final dialogContext = context;
            await _initializeControllerFuture;
            final image = await _cameraController.takePicture();
            if (!mounted) return;
            showDialog(
              context: dialogContext,
              builder: (BuildContext context) {
                return AlertDialog(
                  shape: RoundedRectangleBorder(
                    borderRadius: BorderRadius.circular(10),
                  ),
                  // "Photo captured"
                  title: const Text('Foto Capturada'),
                  content: Column(
                    mainAxisSize: MainAxisSize.min,
                    children: [
                      Image.file(File(image.path)),
                      const SizedBox(height: 20),
                      // "Do you want to send the photo or cancel?"
                      const Text('Deseja enviar a foto ou cancelar?'),
                    ],
                  ),
                  actions: [
                    TextButton(
                      onPressed: () {
                        Navigator.of(dialogContext).pop();
                      },
                      // "Cancel"
                      child: const Text('Cancelar'),
                    ),
                    TextButton(
                      onPressed: () {},
                      // "Send"
                      child: const Text('Enviar'),
                    ),
                  ],
                );
              },
            );
          } catch (e) {
            // "Error capturing the photo"
            print('Erro ao capturar a foto: $e');
          }
        },
        child: const Text(
          'SALVAR FOTO', // "SAVE PHOTO"
          style: TextStyle(
            color: Colors.white,
            fontWeight: FontWeight.bold,
            fontSize: 18,
          ),
        ),
      ),
      children: [
        Padding(
          padding: const EdgeInsets.symmetric(vertical: 1),
          child: Row(
            mainAxisAlignment: MainAxisAlignment.start,
            children: [
              IconButton(
                icon: const Icon(Icons.arrow_back),
                onPressed: () {
                  Navigator.of(context).pop();
                },
              ),
              Text(
                _feedbackMessage,
                textAlign: TextAlign.center,
                style: const TextStyle(
                  fontWeight: FontWeight.bold,
                  fontSize: 20,
                ),
              ),
            ],
          ),
        ),
        const SizedBox(height: 20),
        Expanded(
          child: FutureBuilder<void>(
            future: _initializeControllerFuture,
            builder: (context, snapshot) {
              if (snapshot.connectionState == ConnectionState.done) {
                if (snapshot.hasError) {
                  // "Error starting the camera"
                  return const Center(child: Text('Erro ao iniciar a câmera'));
                } else {
                  return CameraPreview(_cameraController);
                }
              } else {
                return const Center(child: CircularProgressIndicator());
              }
            },
          ),
        ),
        const SizedBox(height: 20),
      ],
    );
  }
}
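The conversion above is the most likely crash source: ML Kit's Android face detector expects a genuine NV21 buffer, and concatenated YUV_420_888 planes usually are not one. A minimal sketch of the setup the google_mlkit_face_detection documentation recommends is below: ask the camera plugin for NV21 on Android (BGRA8888 on iOS), so each frame arrives as a single plane that can be handed to ML Kit directly. The `startNv21Stream` function and `onImage` callback are illustrative names, and the rotation is left as a placeholder that real code should derive from the sensor orientation:

```dart
import 'dart:io';
import 'dart:ui';

import 'package:camera/camera.dart';
import 'package:google_mlkit_commons/google_mlkit_commons.dart';

// Hypothetical helper: configure the stream so frames are already in a
// format ML Kit accepts, avoiding manual plane concatenation.
Future<CameraController> startNv21Stream(
  CameraDescription camera,
  void Function(InputImage) onImage,
) async {
  final controller = CameraController(
    camera,
    ResolutionPreset.low,
    enableAudio: false,
    // NV21 on Android, BGRA8888 on iOS -- the formats ML Kit consumes.
    imageFormatGroup:
        Platform.isAndroid ? ImageFormatGroup.nv21 : ImageFormatGroup.bgra8888,
  );
  await controller.initialize();
  await controller.startImageStream((CameraImage image) {
    // With nv21/bgra8888 the frame is delivered as a single plane.
    if (image.planes.length != 1) return;
    final plane = image.planes.first;
    onImage(InputImage.fromBytes(
      bytes: plane.bytes,
      metadata: InputImageMetadata(
        size: Size(image.width.toDouble(), image.height.toDouble()),
        // Placeholder: derive from sensorOrientation + device rotation.
        rotation: InputImageRotation.rotation0deg,
        format: Platform.isAndroid
            ? InputImageFormat.nv21
            : InputImageFormat.bgra8888,
        bytesPerRow: plane.bytesPerRow,
      ),
    ));
  });
  return controller;
}
```

Note that `ImageFormatGroup.nv21` requires a camera plugin version whose Android implementation supports it (the camera_android_camerax implementation listed in this report does).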
[✓] Flutter (Channel stable, 3.24.3, on macOS 14.5 23F79 darwin-arm64, locale pt-BR)
    • Flutter version 3.24.3 on channel stable at /Users/esiogustavopereirafreitas/development/flutter
    • Upstream repository https://github.com/flutter/flutter.git
    • Framework revision 2663184aa7 (6 weeks ago), 2024-09-11 16:27:48 -0500
    • Engine revision 36335019a8
    • Dart version 3.5.3
    • DevTools version 2.37.3
[✓] Android toolchain - develop for Android devices (Android SDK version 35.0.0)
    • Android SDK at /Users/esiogustavopereirafreitas/Library/Android/sdk
    • Platform android-35, build-tools 35.0.0
    • Java binary at: /Applications/Android Studio.app/Contents/jbr/Contents/Home/bin/java
    • Java version OpenJDK Runtime Environment (build 17.0.11+0-17.0.11b1207.24-11852314)
    • All Android licenses accepted.
[✓] Xcode - develop for iOS and macOS (Xcode 15.4)
    • Xcode at /Applications/Xcode.app/Contents/Developer
    • Build 15F31d
    • CocoaPods version 1.15.2
[✓] Chrome - develop for the web
    • Chrome at /Applications/Google Chrome.app/Contents/MacOS/Google Chrome
[✓] Android Studio (version 2024.1)
    • Android Studio at /Applications/Android Studio.app/Contents
    • Flutter plugin can be installed from: 🔨 https://plugins.jetbrains.com/plugin/9212-flutter
    • Dart plugin can be installed from: 🔨 https://plugins.jetbrains.com/plugin/6351-dart
    • Java version OpenJDK Runtime Environment (build 17.0.11+0-17.0.11b1207.24-11852314)
[✓] VS Code (version 1.91.1)
    • VS Code at /Users/esiogustavopereirafreitas/Downloads/Visual Studio Code.app/Contents
    • Flutter extension version 3.98.0
[✓] Connected device (4 available)
    • sdk gphone64 arm64 (mobile) • emulator-5554 • android-arm64 • Android 15 (API 35) (emulator)
    • macOS (desktop) • macos • darwin-arm64 • macOS 14.5 23F79 darwin-arm64
    • Mac Designed for iPad (desktop) • mac-designed-for-ipad • darwin • macOS 14.5 23F79 darwin-arm64
    • Chrome (web) • chrome • web-javascript • Google Chrome 129.0.6668.103
    ! Error: Browsing on the local area network for iPhone. Ensure the device is unlocked and attached with a cable or associated with the same local area network as this Mac. The device must be opted into Developer Mode to connect wirelessly. (code -27)
[✓] Network resources
    • All expected network resources are available.
• No issues found!
Steps to reproduce
1- Set up a Flutter project with camera and face detection functionality using camera and google_mlkit_face_detection packages.
2- Initialize the camera and start the image stream for real-time face detection.
3- Run the app on a physical device (the issue does not occur on the emulator).
4- Attempt to detect faces using the continuous camera stream.
5- The app may crash when handling the camera's continuous image stream.
Expected results
1- The app should run smoothly on a physical device, continuously detecting faces from the camera stream without crashing.
2- The camera stream should provide real-time data for face detection without causing any performance issues.
Actual results
1- The app works fine on an emulator, but on a physical device, it freezes and crashes when handling the continuous camera stream.
2- The app crashes even after setting delays to prevent overloading and locking the camera orientation.
Screenshots or Video
Logs
[Paste your logs here]
Flutter Doctor output
Did you try our example app?
Yes
Is it reproducible in the example app?
Yes
Reproducible in which OS?
iOS and Android
Flutter/Dart Version?
Flutter 3.24.3 • channel stable • https://github.com/flutter/flutter.git
Framework • revision 2663184aa7 (6 weeks ago) • 2024-09-11 16:27:48 -0500
Engine • revision 36335019a8
Tools • Dart 3.5.3 • DevTools 2.37.3
Plugin Version?
camera: ^0.11.0+2
google_mlkit_face_detection: ^0.11.0
google_mlkit_commons: ^0.8.1
camera_android_camerax: 0.6.7+2