Merge two videos into one with side-by-side composition

Merge two videos into one with side-by-side composition

John Crossman
Is there ffmpeg documentation on how to merge two videos into one with
side-by-side composition?

The following is an example of what I'm after. Jump to 0:30:
http://www.youtube.com/watch?v=0AJWmt5A62Y

Thank you.
_______________________________________________
ffmpeg-user mailing list
[hidden email]
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

Re: Merge two videos into one with side-by-side composition

Lou Logan
On Mon, 10 Jun 2013 09:27:22 -0700
John Crossman <[hidden email]> wrote:

> Is there ffmpeg documentation on how to merge two videos into one with
> side-by-side composition?
>
> The following is an example of what I'm after. Jump to 0:30:
> http://www.youtube.com/watch?v=0AJWmt5A62Y

You can use the pad [1] and overlay [2] video filters to place your
videos:

ffmpeg -i input1 -i input2 -filter_complex \
"[0:v:0]pad=iw*2:ih[bg]; [bg][1:v:0]overlay=w" output

The overlay docs state:

  Be aware that frames are taken from each input video in timestamp
  order, hence, if their initial timestamps differ, it is a good
  idea to pass the two inputs through a setpts=PTS-STARTPTS filter to
  have them begin at the same zero timestamp, as the example for the
  movie filter does.

Resulting in:

ffmpeg -i input1 -i input2 -filter_complex \
"[0:v]setpts=PTS-STARTPTS, pad=iw*2:ih[bg]; \
[1:v]setpts=PTS-STARTPTS[fg]; [bg][fg]overlay=w" output
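If both inputs share the same height, a build of ffmpeg that includes the
hstack filter can express the same layout more directly. hstack is not in
every build, so this is only an alternative sketch:

```shell
# Sketch: side-by-side via hstack instead of pad+overlay.
# Assumes an ffmpeg build that has hstack and inputs of equal height.
ffmpeg -i input1 -i input2 -filter_complex \
"[0:v]setpts=PTS-STARTPTS[l]; [1:v]setpts=PTS-STARTPTS[r]; \
[l][r]hstack=inputs=2" output
```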

By default only one audio stream will be used, so if you want to combine
both input audio streams into one output audio stream you can use the
amerge [3] and pan [4] audio filters. My example has one stereo audio
stream per input and the output will have one combined stereo audio
stream:

ffmpeg -i input1 -i input2 -filter_complex \
"[0:v]setpts=PTS-STARTPTS, pad=iw*2:ih[bg]; \
[1:v]setpts=PTS-STARTPTS[fg]; [bg][fg]overlay=w; \
amerge,pan=stereo:c0<c0+c2:c1<c1+c3" output

This combination of filters is not the only method to do this, and I
may have forgotten something, so feel free to experiment.

[1] http://ffmpeg.org/ffmpeg-filters.html#pad
[2] http://ffmpeg.org/ffmpeg-filters.html#overlay-1
[3] http://ffmpeg.org/ffmpeg-filters.html#amerge
[4] http://ffmpeg.org/ffmpeg-filters.html#pan
_______________________________________________
ffmpeg-user mailing list
[hidden email]
http://ffmpeg.org/mailman/listinfo/ffmpeg-user

Re: Merge two videos into one with side-by-side composition

John Crossman
Thank you, Lou! I'll give it a try and report results.


--
*John Crossman*
Educational Technology Services
UC Berkeley

Work: (510) 725-2969
Personal: (510) 681-5483

Re: Merge two videos into one with side-by-side composition

John Crossman
The side-by-side composition worked... Thank you!

But one problem: The video on the right is positioned too far (1) to the
right and (2) down. Thus, those edge portions of the video are chopped away.

Here is the command I'm using:

ffmpeg -i videoLeft.mpg -i videoRight.mpg -filter_complex \
"[0:v:0]pad=iw*2:ih[bg]; [bg][1:v:0]overlay=w" output.mpg

Any tips?

Thanks again,
John



Re: Merge two videos into one with side-by-side composition

Lou Logan
On Fri, 14 Jun 2013 15:16:50 -0700
John Crossman <[hidden email]> wrote:

> The side-by-side composition worked... Thank you!
>
> But one problem: The video on the right is positioned too far (1) to the
> right and (2) down. Thus, those edge portions of the video are chopped away.
>
> Here is command I'm using:
>
> ffmpeg -i videoLeft.mpg -i videoRight.mpg -filter_complex
> "[0:v:0]pad=iw*2:ih[bg]; [bg][1:v:0]overlay=w" output.mpg

Please also include the complete ffmpeg console output.

> Any tips?
>
> Thanks again,
> John

Note that top-posting is not recommended on this mailing list.

Re: Merge two videos into one with side-by-side composition

John Crossman
Thanks for the quick reply. Here is the output.

ffmpeg -i camera.mpg -i screen.mpg -filter_complex \
"[0:v:0]pad=iw*2:ih[bg]; [bg][1:v:0]overlay=w" output.mpg

ffmpeg version 1.2.1 Copyright (c) 2000-2013 the FFmpeg developers
  built on May 28 2013 16:00:47 with Apple LLVM version 4.2
(clang-425.0.27) (based on LLVM 3.2svn)
  configuration: --prefix=/opt/local --enable-swscale --enable-avfilter
--enable-libmp3lame --enable-libvorbis --enable-libopus --enable-libtheora
--enable-libschroedinger --enable-libopenjpeg --enable-libmodplug
--enable-libvpx --enable-libspeex --enable-libfreetype
--mandir=/opt/local/share/man --enable-shared --enable-pthreads
--cc=/usr/bin/clang --arch=x86_64 --enable-yasm --enable-gpl
--enable-postproc --enable-libx264 --enable-libxvid
  libavutil      52. 18.100 / 52. 18.100
  libavcodec     54. 92.100 / 54. 92.100
  libavformat    54. 63.104 / 54. 63.104
  libavdevice    54.  3.103 / 54.  3.103
  libavfilter     3. 42.103 /  3. 42.103
  libswscale      2.  2.100 /  2.  2.100
  libswresample   0. 17.102 /  0. 17.102
  libpostproc    52.  2.100 / 52.  2.100
[mpeg @ 0x7f896a831000] max_analyze_duration 5000000 reached at 5005000
microseconds
Input #0, mpeg, from 'camera.mpg':
  Duration: 00:00:08.61, start: 0.500000, bitrate: 513 kb/s
    Stream #0:0[0x1e0]: Video: mpeg1video, yuv420p, 768x480 [SAR 1:1 DAR
8:5], 104857 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 29.97 tbc
[mpeg @ 0x7f896a813600] max_analyze_duration 5000000 reached at 5000000
microseconds
Input #1, mpeg, from 'screen.mpg':
  Duration: 00:00:08.83, start: 0.500000, bitrate: 530 kb/s
    Stream #1:0[0x1e0]: Video: mpeg1video, yuv420p, 1024x768 [SAR 1:1 DAR
4:3], 104857 kb/s, 30 fps, 30 tbr, 90k tbn, 30 tbc
[Parsed_overlay_1 @ 0x7f896a413a60] Overlay area with coordinates x1:1024
y1:0 x2:2048 y2:768 is not completely contained within the output with size
1536x480
[mpeg @ 0x7f896a839c00] VBV buffer size not set, muxing may fail
Output #0, mpeg, to 'output.mpg':
  Metadata:
    encoder         : Lavf54.63.104
    Stream #0:0: Video: mpeg1video, yuv420p, 1536x480 [SAR 1:1 DAR 16:5],
q=2-31, 200 kb/s, 90k tbn, 29.97 tbc
Stream mapping:
  Stream #0:0 (mpeg1video) -> pad
  Stream #1:0 (mpeg1video) -> overlay:overlay
  overlay -> Stream #0:0 (mpeg1video)
Press [q] to stop, [?] for help
frame=  260 fps=109 q=31.0 Lsize=     708kB time=00:00:08.64 bitrate=
671.1kbits/s
video:703kB audio:0kB subtitle:0 global headers:0kB muxing overhead
0.709978%





Re: Merge two videos into one with side-by-side composition

Lou Logan
On Fri, 14 Jun 2013 15:48:49 -0700
John Crossman <[hidden email]> wrote:

> Thanks for the quick reply. Here is the output.
> *
> *
> *ffmpeg -i camera.mpg -i screen.mpg -filter_complex
> "[0:v:0]pad=iw*2:ih[bg]; [bg][1:v:0]overlay=w" output.mpg*
...

> Input #0, mpeg, from 'camera.mpg':
>   Duration: 00:00:08.61, start: 0.500000, bitrate: 513 kb/s
>     Stream #0:0[0x1e0]: Video: mpeg1video, yuv420p, 768x480 [SAR 1:1 DAR
> 8:5], 104857 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 29.97 tbc
> [mpeg @ 0x7f896a813600] max_analyze_duration 5000000 reached at 5000000
> microseconds
> Input #1, mpeg, from 'screen.mpg':
>   Duration: 00:00:08.83, start: 0.500000, bitrate: 530 kb/s
>     Stream #1:0[0x1e0]: Video: mpeg1video, yuv420p, 1024x768 [SAR 1:1 DAR
> 4:3], 104857 kb/s, 30 fps, 30 tbr, 90k tbn, 30 tbc
> [Parsed_overlay_1 @ 0x7f896a413a60] Overlay area with coordinates x1:1024
> y1:0 x2:2048 y2:768 is not completely contained within the output with size
> 1536x480

My original example assumed that your inputs were the same frame size.
Your inputs vary in size, and your padding was based on the smaller
input, so your larger input was not fitting correctly.

You can pad from the larger video and center the smaller video in the
padded area (you should check the arithmetic since it's Friday and I
just got off of work):

ffmpeg -i camera.mpg -i screen.mpg -filter_complex \
"[1:v]pad=iw*2:ih[bg];[bg][0:v]overlay=W/2+((W/2-w)/2):(H-h)/2" \
-qscale:v 2 output.mpg

Or you could add the scale filter to make the inputs a more similar size.
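For instance, a sketch of the scale approach. The target size is an
assumption based on the input sizes shown in the console output: the
1024x768 screen capture is scaled to 640x480 (preserving its 4:3 aspect)
to match the 480-pixel-tall camera video before padding and overlaying:

```shell
# Sketch: scale screen.mpg down to the camera's height, pad the camera
# video by the scaled width, then place the scaled video on the right.
# W and w in overlay are the background and overlaid widths, so W-w puts
# the second video flush against the right edge.
ffmpeg -i camera.mpg -i screen.mpg -filter_complex \
"[1:v]scale=640:480,setpts=PTS-STARTPTS[right]; \
 [0:v]setpts=PTS-STARTPTS,pad=iw+640:ih[bg]; \
 [bg][right]overlay=W-w:0" -qscale:v 2 output.mpg
```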

Again, please remember that top-posting is not recommended on this
mailing list.

Re: Merge two videos into one with side-by-side composition

John Crossman
On Fri, Jun 14, 2013 at 6:39 PM, Lou <[hidden email]> wrote:

> My original example assumed that your inputs were the same frame size.
> Your inputs vary in size, and your padding was based on the smaller
> input, so your larger input was not fitting correctly.
>
> You can pad from the larger video and center the smaller video in the
> padded area (you should check the arithmetic since it's Friday and I
> just got off of work):
>
> ffmpeg -i camera.mpg -i screen.mpg -filter_complex \
> "[1:v]pad=iw*2:ih[bg];[bg][0:v]overlay=W/2+((W/2-w)/2):(H-h)/2" \
> -qscale:v 2 output.mpg
>
> Or you could add the scale filter make the inputs a more similar size.
>


Thanks! That tip is a big help. One remaining question:

The resulting video has no audio. There are two ways I could modify the
ffmpeg command to get the audio I need:

   1. tell ffmpeg to use audio of one of the input videos
   2. pass in a third input which is the audio stream

My preference is option #1. Is there an easy answer here?

Thanks again.

Re: Merge two videos into one with side-by-side composition

Lou Logan
On Wed, 19 Jun 2013 15:55:42 -0700
John Crossman <[hidden email]> wrote:

> Thanks! That tip is a big help. One remaining question:
>
> The resulting video has no audio. There are two ways I could modify ffmpeg
> command to get the audio I need:
>
>    1. tell ffmpeg to use audio of one of the input videos

None of your inputs have audio in your console output that you provided.
If there was audio in one of the inputs then it should have
automatically been mapped to the output.
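When one input does carry audio, it can also be selected explicitly with
-map. A sketch, assuming (hypothetically) that the second input is the one
with an audio stream:

```shell
# Sketch: label the filtered video [v], then map it together with the
# audio of input 1. -map "[v]" picks the filtergraph output; -map 1:a:0
# picks the first audio stream of the second input.
ffmpeg -i camera.mpg -i screen.mpg -filter_complex \
"[1:v]pad=iw*2:ih[bg];[bg][0:v]overlay=W/2+((W/2-w)/2):(H-h)/2[v]" \
-map "[v]" -map 1:a:0 -qscale:v 2 output.mpg
```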

If you have audio in both inputs then see my previous answer:
<https://lists.ffmpeg.org/pipermail/ffmpeg-user/2013-June/015662.html>

>    2. pass in a third input which is the audio stream

That's easy. Just add another input. Place it after the other inputs so
your pad and overlay command still works, otherwise the mapping will be
different.
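A sketch of that, assuming a separate audio file named audio.wav (the
file name is a placeholder):

```shell
# Sketch: a third input supplies the audio. It comes after the two video
# inputs so the existing pad/overlay labels are unchanged. -map "[v]"
# takes the filtered video and -map 2:a takes the third input's audio.
ffmpeg -i camera.mpg -i screen.mpg -i audio.wav -filter_complex \
"[1:v]pad=iw*2:ih[bg];[bg][0:v]overlay=W/2+((W/2-w)/2):(H-h)/2[v]" \
-map "[v]" -map 2:a -qscale:v 2 output.mpg
```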