
I have to record a video of my app, similar to "Talking Tom". With help from Here and Here I have captured the screen and made a video from those images, but that video has no sound.

I have recorded the sound and the video in separate files, but I don't know how to merge them.

Can anyone tell me how to add the sound to this video, or how to record the video with sound in the first place?

sajwan
  • possible duplicate of [how to record screen video as like Talking Tomcat application does in iphone?](http://stackoverflow.com/questions/6980370/how-to-record-screen-video-as-like-talking-tomcat-application-does-in-iphone) –  Oct 30 '11 at 05:24
  • 1
    yes that is similar (but not duplicate) and also have mentioned that in my link....Please read the full question before downvoting...... the problem is with adding sound.... – sajwan Oct 30 '11 at 05:31
  • did you get a solution to this? – Rajneesh071 Nov 06 '12 at 06:23

2 Answers

-(void)processVideo:(NSURL *)videoUrl
{
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];

    AVMutableComposition *mixComposition = [AVMutableComposition composition];

    AppDelegate *appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];

    NSError *error = nil;

    // One composition track per sound file; each starts at its "startTime".
    for (NSMutableDictionary *audioInfo in appDelegate.audioInfoArray)
    {
        NSString *pathString = [[NSHomeDirectory() stringByAppendingString:@"/Documents/"]
                                   stringByAppendingString:[audioInfo objectForKey:@"fileName"]];

        AVURLAsset *urlAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:pathString] options:nil];

        AVAssetTrack *audioAssetTrack = [[urlAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        AVMutableCompositionTrack *compositionAudioTrack =
            [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                        preferredTrackID:kCMPersistentTrackID_Invalid];

        NSLog(@"%lf", [[audioInfo objectForKey:@"startTime"] doubleValue]);

        CMTime audioStartTime = CMTimeMake([[audioInfo objectForKey:@"startTime"] doubleValue] * TIME_SCALE, TIME_SCALE);

        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, urlAsset.duration)
                                       ofTrack:audioAssetTrack
                                        atTime:audioStartTime
                                         error:&error];
    }

    // The (silent) video track.
    AVMutableCompositionTrack *compositionVideoTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                    atTime:kCMTimeZero
                                     error:&error];

    AVAssetExportSession *_assetExport =
        [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                         presetName:AVAssetExportPresetPassthrough];

    NSString *videoName = @"export.mov";

    NSString *exportPath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:videoName];
    NSURL    *exportUrl = [NSURL fileURLWithPath:exportPath];

    // Remove any previous export at the same path.
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath])
    {
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
    }

    _assetExport.outputFileType = AVFileTypeQuickTimeMovie; // "com.apple.quicktime-movie"
    NSLog(@"file type %@", _assetExport.outputFileType);
    _assetExport.outputURL = exportUrl;
    _assetExport.shouldOptimizeForNetworkUse = YES;

    [_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
        switch (_assetExport.status)
        {
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Export Complete");
                //[self uploadToYouTube];
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export Failed");
                NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export Cancelled");
                break;
            default:
                break;
        }
    }];
}

Just assign your movie file (i.e. the one without audio) to an NSURL and pass it to the processVideo: method above. Before calling it, add the sound files you want to merge with the video to audioInfoArray somewhere else in the program; the method will then merge the audio into the video file.

You can also decide where each sound starts to play in the video via the value stored under the "startTime" key in audioInfoArray. In the switch statement of the completion handler you can then play the video, upload it to Facebook, etc. as you wish.
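To illustrate, here is a minimal sketch of how audioInfoArray might be populated before calling processVideo:. The dictionary keys "fileName" and "startTime" come from the method above; the file names "meow.caf" and "video.mov" are hypothetical placeholders.

```objectivec
// Illustrative setup: each entry names a sound file in Documents/ and the
// time (in seconds, as a double) at which it should start in the video.
AppDelegate *appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];
appDelegate.audioInfoArray = [NSMutableArray array];

[appDelegate.audioInfoArray addObject:
    [NSMutableDictionary dictionaryWithObjectsAndKeys:
        @"meow.caf", @"fileName",                       // hypothetical file in <app>/Documents/
        [NSNumber numberWithDouble:2.5], @"startTime",  // play 2.5 s into the video
        nil]];

// Then hand the silent movie to the merge method:
NSURL *videoUrl = [NSURL fileURLWithPath:
    [[self pathToDocumentsDirectory] stringByAppendingPathComponent:@"video.mov"]];
[self processVideo:videoUrl];
```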

Balan Prabhu

An iOS app can't really record (using any public API) the sound that it itself makes. What an app can do is generate the same audio twice, one for playing, one for streaming to a file. You have to stick with only sounds that you know how to do both ways, such as copying PCM waveforms into buffers, etc.

Once you have your duplicate buffer of audio samples, there should be example code on how to send it to an AVAssetWriter.
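As a starting point, a hedged sketch of that AVAssetWriter side, assuming a writer configured elsewhere and a CMSampleBuffer wrapping your duplicated PCM samples (names and settings here are one common choice, not the only one):

```objectivec
#import <AVFoundation/AVFoundation.h>

// Hypothetical helper: attach an audio input to an existing AVAssetWriter.
// 'writer' is assumed to be created and configured elsewhere.
static AVAssetWriterInput *AddAudioInput(AVAssetWriter *writer) {
    NSDictionary *settings = @{
        AVFormatIDKey         : @(kAudioFormatMPEG4AAC),
        AVSampleRateKey       : @44100.0,
        AVNumberOfChannelsKey : @1
    };
    AVAssetWriterInput *input =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                           outputSettings:settings];
    input.expectsMediaDataInRealTime = YES;
    if ([writer canAddInput:input]) {
        [writer addInput:input];
    }
    return input;
}

// Each time you copy PCM samples into the playback buffer, wrap the same
// samples in a CMSampleBuffer and append it to the input:
//   if (input.isReadyForMoreMediaData) [input appendSampleBuffer:sampleBuffer];
```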

hotpaw2
  • yes, that is the same thing I am asking. I have created the video and the sound in two separate files, but I don't know how to join them – sajwan Oct 30 '11 at 05:54