Peter is right.
The following code makes the concept clearer:
1)
Float64 seconds = 5;
int32_t preferredTimeScale = 600;
CMTime inTime = CMTimeMakeWithSeconds(seconds, preferredTimeScale);
CMTimeShow(inTime);
The above code gives:
{3000/600 = 5.000}
Which means a total duration of 5 seconds, represented as 3000 time units at a timescale of 600 units per second.
2)
int64_t value = 10000;
int32_t preferredTimeScale = 600;
CMTime inTime = CMTimeMake(value, preferredTimeScale);
CMTimeShow(inTime);
This one gives {10000/600 = 16.667}
Which means a total duration of 16.667 seconds, represented as 10000 time units at a timescale of 600 units per second.
Notice the difference between CMTimeMake(int64_t value, int32_t timescale)
and CMTimeMakeWithSeconds(Float64 seconds, int32_t preferredTimeScale)
Hope this explanation helps. If you need further clarification, feel free to ask in a comment on this post.