I'm trying to draw a circle representing the leftPupil result detected by a VNDetectFaceLandmarksRequest. How can I draw the `normalizedPoints` of a `VNFaceLandmarkRegion2D` onto my UIView? Judging by the name `normalizedPoints`, I assume the points are normalized by some convention. So how can I scale the points and convert them into the coordinate space of an AVCaptureVideoPreviewLayer or another view?
From a few Vision tutorials, I've seen a conversion like this:
let faceLandmarkPoints = convertedPoints.map { (point: (x: CGFloat, y: CGFloat)) -> (x: CGFloat, y: CGFloat) in
    let pointX = point.x * boundingBox.width + boundingBox.origin.x
    let pointY = point.y * boundingBox.height + boundingBox.origin.y
    return (x: pointX, y: pointY)
}
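For context on what that tutorial snippet is doing: Vision's landmark points are normalized to the face's bounding box, and the bounding box is itself normalized to the image, with the origin at the bottom-left. So getting to UIKit pixels takes two scalings plus a Y flip. A minimal pure-math sketch (the helper name is mine, not Apple's):

```swift
import Foundation

// Hypothetical helper (not from the question): composes the two normalizations.
// `normalizedPoint` is a landmark point in [0, 1] relative to the face bounding
// box; `boundingBox` is the face rectangle in [0, 1] relative to the image
// (Vision's origin is bottom-left). The result is a pixel coordinate in
// UIKit's top-left-origin space.
func landmarkPointInImage(_ normalizedPoint: CGPoint,
                          boundingBox: CGRect,
                          imageSize: CGSize) -> CGPoint {
    // Stage 1: bounding-box space -> normalized image space.
    let nx = boundingBox.origin.x + normalizedPoint.x * boundingBox.width
    let ny = boundingBox.origin.y + normalizedPoint.y * boundingBox.height
    // Stage 2: normalized image space -> pixels, flipping Y for UIKit.
    return CGPoint(x: nx * imageSize.width,
                   y: (1 - ny) * imageSize.height)
}
```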
But that doesn't really work for me. My full code is below.

The conversion:
// Rescale the points from Vision coordinates
func convertPointsForFace(landmark: VNFaceLandmarkRegion2D?, _ boundingBox: CGRect)
        -> [(x: CGFloat, y: CGFloat)]? {
    if let points = landmark?.normalizedPoints {
        let convertedPoints = self.convertToCGFloat(points: points)
        let faceLandmarkPoints = convertedPoints.map { (point: (x: CGFloat, y: CGFloat)) -> (x: CGFloat, y: CGFloat) in
            let pointX = point.x * boundingBox.width + boundingBox.origin.x
            let pointY = point.y * boundingBox.height + boundingBox.origin.y
            return (x: pointX, y: pointY)
        }
        return faceLandmarkPoints
    }
    return nil
}
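One likely reason the dot lands at the top-left: the values this function returns are still normalized image coordinates, i.e. between 0 and 1, so drawing them directly places the circle within a point of the layer's origin. A hedged sketch of the missing step, which scales into the preview layer's bounds and flips Y (the function name and the assumption that `previewLayer.bounds.size` is the right target size are mine):

```swift
import Foundation

// Hypothetical follow-up step (my naming): scale a normalized image
// coordinate into a layer's coordinate space, flipping Y because Vision's
// origin is bottom-left while CALayer's is top-left.
func scaleToLayer(_ point: (x: CGFloat, y: CGFloat),
                  layerSize: CGSize) -> (x: CGFloat, y: CGFloat) {
    return (x: point.x * layerSize.width,
            y: (1 - point.y) * layerSize.height)
}
```

With something like this, the call site would become `drawPoint(point: scaleToLayer(converted, layerSize: previewLayer.bounds.size))`. Note this simple scaling ignores the preview layer's `videoGravity` cropping, so it is only exact when the layer and video share an aspect ratio.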
Drawing the point:
// Draw a point on the preview layer
func drawPoint(point: (x: CGFloat, y: CGFloat)) {
    // We use a CAShapeLayer to draw the circle
    let shapeLayer = CAShapeLayer()
    let circlePath = UIBezierPath(arcCenter: CGPoint(x: point.x, y: point.y),
                                  radius: CGFloat(20),
                                  startAngle: CGFloat(0),
                                  endAngle: CGFloat(Double.pi * 2),
                                  clockwise: true)
    shapeLayer.path = circlePath.cgPath
    shapeLayer.fillColor = UIColor.red.cgColor
    shapeLayer.strokeColor = UIColor.red.cgColor
    shapeLayer.lineWidth = 1.0
    self.previewLayer.addSublayer(shapeLayer)
}
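A side note on this method: it adds a fresh CAShapeLayer every time it runs, so on a live feed stale dots accumulate. One possible sketch for clearing them first, assuming each dot layer is tagged with `name = "dot"` when created (that tagging convention is my assumption, not part of the question's code):

```swift
import QuartzCore

// Hypothetical cleanup (assumes each dot layer was given name = "dot"):
func removeOldDots(from layer: CALayer) {
    layer.sublayers?
        .filter { $0.name == "dot" }
        .forEach { $0.removeFromSuperlayer() }
}
```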
The VNDetectFaceLandmarksRequest completion handler:
func landmarkDetectCompletionHandler(request: VNRequest, _ error: Error?) {
    if error != nil {
        print("Errors in the landmark detection: \(error.debugDescription)")
    }
    guard let observations = request.results as? [VNFaceObservation] else {
        fatalError("Failed to convert result to VNFaceObservation type")
    }
    for face in observations {
        DispatchQueue.main.async { [weak self] in
            let faceBox = face.boundingBox
            let leftEye = face.landmarks?.leftPupil
            let leftEyePoints = self?.convertPointsForFace(landmark: leftEye, faceBox)
            print(leftEyePoints?.first!.x, leftEyePoints?.first!.y)
            self?.drawPoint(point: (leftEyePoints?.first)!)
        }
    }
}
The dot always shows up in the top-left corner of my AVCaptureVideoPreviewLayer.
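Since the target here is an AVCaptureVideoPreviewLayer specifically, one alternative to manual scaling is to let the layer itself do the mapping via `layerPointConverted(fromCaptureDevicePoint:)`, which accounts for the layer's `videoGravity` and rotation. Capture-device points use a top-left origin while Vision uses bottom-left, so a Y flip is needed first; the exact flip can also depend on camera orientation and mirroring, so treat this as a sketch:

```swift
import AVFoundation

// Hedged alternative to manual scaling: let the preview layer convert the
// point. Vision's normalized coordinates have a bottom-left origin, while
// capture-device points have a top-left origin, hence the Y flip.
// (`previewLayer` is assumed to be the session's AVCaptureVideoPreviewLayer.)
func layerPoint(forVisionPoint visionPoint: CGPoint,
                in previewLayer: AVCaptureVideoPreviewLayer) -> CGPoint {
    let devicePoint = CGPoint(x: visionPoint.x, y: 1 - visionPoint.y)
    return previewLayer.layerPointConverted(fromCaptureDevicePoint: devicePoint)
}
```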