
- iOS: a strange green line appears on the left/right when cropping a video. How do I remove the green line from the video? When I crop a video, it flashes green two or three times at that point, or a flickering green or red line gets mixed into the video along the left edge, the bottom edge, or both.

Below is my video crop method. How can I fix the green line that appears after the video is cropped?

-(void)cropButton 
{ 
     CGRect cropFrame = self.cropView.croppedImageFrame; 

     //load our movie Asset 
     AVAsset *asset; 
      asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:[self.videoDataArr objectAtIndex:self.selectedIndex-1]]]; 

     //create an avassetrack with our asset 
     AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 

     //create a video composition and preset some settings 
     AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition]; 
     videoComposition.frameDuration = CMTimeMake(1, 30); 

     //create a video instruction 
     AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction]; 
     instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration); 

     AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack]; 

     UIImageOrientation videoOrientation = [self getVideoOrientationFromAsset:asset]; 

     CGAffineTransform t1 = CGAffineTransformIdentity; 
     CGAffineTransform t2 = CGAffineTransformIdentity; 

     switch (videoOrientation) 
     { 
      case UIImageOrientationUp: 
       t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height - cropFrame.origin.x, 0 - cropFrame.origin.y); 
       t2 = CGAffineTransformRotate(t1, M_PI_2); 
       break; 
      case UIImageOrientationDown: 
       t1 = CGAffineTransformMakeTranslation(0 - cropFrame.origin.x, clipVideoTrack.naturalSize.width - cropFrame.origin.y); // naturalSize.width is effectively the height when the video is upside down 
       t2 = CGAffineTransformRotate(t1, - M_PI_2); 

       break; 
      case UIImageOrientationRight: 
       t1 = CGAffineTransformMakeTranslation(0 - cropFrame.origin.x, 0 - cropFrame.origin.y); 
       t2 = CGAffineTransformRotate(t1, 0); 
       break; 
      case UIImageOrientationLeft: 
       t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.width - cropFrame.origin.x, clipVideoTrack.naturalSize.height - cropFrame.origin.y); 
       t2 = CGAffineTransformRotate(t1, M_PI); 
       break; 
      default: 
       NSLog(@"no supported orientation has been found in this video"); 
       break; 
     } 

     CGAffineTransform finalTransform = t2; 
     videoComposition.renderSize = CGSizeMake(cropFrame.size.width,cropFrame.size.height); 

     [transformer setTransform:finalTransform atTime:kCMTimeZero]; 

     //add the transformer layer instructions, then add to video composition 
     instruction.layerInstructions = [NSArray arrayWithObject:transformer]; 
     videoComposition.instructions = [NSArray arrayWithObject: instruction]; 

     //Create an Export Path to store the cropped video 
     NSString * documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0]; 
     __block NSString *exportPath = [documentsPath stringByAppendingFormat:@"/CroppedVideo.mp4"]; 
     NSURL *exportUrl = [NSURL fileURLWithPath:exportPath]; 

     //Remove any previous video at that path 
     [[NSFileManager defaultManager] removeItemAtURL:exportUrl error:nil]; 
     AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality] ; 
     exporter.videoComposition = videoComposition; 
     exporter.outputURL = exportUrl; 
     NSLog(@"exported url : %@",exportUrl); 
     exporter.outputFileType = AVFileTypeQuickTimeMovie; 

     [exporter exportAsynchronouslyWithCompletionHandler:^ 
     { 
      dispatch_async(dispatch_get_main_queue(), ^{ 
       switch ([exporter status]) { 
        case AVAssetExportSessionStatusCompleted: 
        { 
         self.navigationController.toolbarHidden = YES; 
         NSError *error = nil; 
         NSString *targetPath; 
          targetPath = [self.videoDataArr objectAtIndex:self.selectedIndex-1]; 

         [[NSFileManager defaultManager] removeItemAtPath:targetPath error:&error]; 
         if(error) 
         { 
          NSLog(@"Error is : %@",error); 
         } 
         error = nil; 
         [[NSFileManager defaultManager] moveItemAtPath:exportPath toPath:targetPath error:&error]; 
         if(error) 
         { 
          NSLog(@"Error is : %@",error); 
         } 
         self.mySAVideoRangeSlider.videoUrl = self.videourl; 
         [self.mySAVideoRangeSlider getMovieFrame]; 

        } 
         break; 
        case AVAssetExportSessionStatusFailed: 
         NSLog(@"Export failed: %@", [[exporter error] localizedDescription]); 
         break; 
        case AVAssetExportSessionStatusCancelled: 
         NSLog(@"Export canceled"); 
         break; 
        default: 
         NSLog(@"NONE"); 
         break; 
       } 
      }); 
     }]; 
    } 
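
For reference, the getVideoOrientationFromAsset: helper called above is not shown in the question. A common approach is to read the video track's preferredTransform and map it to the four orientations the switch statement expects; the sketch below is my guess at such a helper under that assumption, not the asker's actual code, and it assumes the same UIKit/AVFoundation imports as the rest of the class.

-(UIImageOrientation)getVideoOrientationFromAsset:(AVAsset *)asset 
{ 
     AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 
     CGAffineTransform t = videoTrack.preferredTransform; 

     //portrait: the track is stored rotated 90 degrees, matching the M_PI_2 case above 
     if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) return UIImageOrientationUp; 
     //portrait upside down 
     if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) return UIImageOrientationDown; 
     //landscape with identity transform: no rotation needed 
     if (t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) return UIImageOrientationRight; 
     //landscape rotated 180 degrees 
     if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) return UIImageOrientationLeft; 
     return UIImageOrientationRight; 
}

The orientation labels are chosen so that each case lines up with the rotation cropButton applies (Up gets M_PI_2, Right gets no rotation); other posts that use this preferredTransform check sometimes label the cases differently.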

Answer


Pay attention to your video renderSize: the width and height should be even, and ideally divisible by 4 (see this discussion link). If you choose a resolution that is not divisible by 16, 8, or 4, a one-pixel green border may appear along the bottom or right edge of the frame.

"水平または垂直サイズが16で割り切れない場合、エンコーダは、右端または下端に適切な数の黒い「オーバーハング」サンプルを画像に貼り付けます。例えば、1920x1080でHDTVを符号化する場合、エンコーダは、行カウント1088を作成するために、ht eimage配列に8行の黒画素を付加する。
