

Before we dive into the many ways to record Jitter output, let’s take a step back and first identify what we’re trying to record. In most cases the answer is: “exactly what I see in my window.” OK, sure – but we need to locate exactly which object or patch-cord in our patch is drawing to the window. If the content is a chain of textures or matrices, it’s simply a matter of taking the output from the last object in the chain. However, if the content is OpenGL geometry objects (jit.gl.mesh, jit.gl.gridshape, etc.), or a scene post-processed with jit.gl.pass, it’s not always easy to identify. Fortunately the latter case has an easy solution: jit.world. If your patch’s render context and window are handled by the jit.world object (which they should be), then you simply need to enable its matrix or texture output, depending on which recording technique you are using (more on that below). This will send the window content out jit.world’s first outlet as a matrix or texture. Now might be a good time to brush up on the difference between matrices and textures, texture output, or even what a texture is.
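To make that concrete, here is a rough sketch of what a software-side capture chain could look like if you scripted it from a [js] object. It is only an illustration, not the exact setup used in this article: it assumes jit.world’s @output_texture attribute for texture output, jit.gl.asyncread for reading the texture back into a matrix, and jit.record for writing that matrix stream to disk; the context name, coordinates, and file name are placeholders.

```javascript
// Illustrative sketch for a [js] object in Max: build a texture-capture
// chain around jit.world. Assumptions: jit.world's @output_texture sends
// the rendered frame out its left outlet; jit.gl.asyncread reads that
// texture back to a matrix; jit.record writes matrices to a movie file.
var recorder = null;

function build() {
    var p = this.patcher;

    // jit.world owns the render context ("grab" here is a placeholder name)
    // and the window; @output_texture 1 passes frames out its left outlet.
    var world = p.newdefault(30, 30, "jit.world", "grab", "@output_texture", 1);

    // GPU-to-CPU readback: texture in, matrix out.
    var readback = p.newdefault(30, 110, "jit.gl.asyncread", "grab");

    // Matrix recorder; realtime mode is just one possible choice here.
    recorder = p.newdefault(30, 190, "jit.record", "@realtime", 1);

    p.connect(world, 0, readback, 0);
    p.connect(readback, 0, recorder, 0);
}

function start() {
    // File name is a placeholder; "write" begins recording, "stop" ends it.
    if (recorder) recorder.message("write", "jitter-capture.mov");
}

function stop() {
    if (recorder) recorder.message("stop");
}
```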

A lot of my work relies on feedback and live input, making a frame-by-frame capture setup a non-starter. Similarly, the CPU expense of recording in realtime is something I’d rather spend on extra vertices or layers of processing. Probably my favorite way to go is recording directly to an Atomos Ninja or Blade, or a Blackmagic HyperDeck Studio Mini. These outboard recorders can handle pretty much any resolution or framerate you throw at them and don’t cost you any more processing than outputting to a second monitor or projector. They even offer loop-through so you don’t have to use an extra port. It’s not the most cost-effective method if you purchase one, but it does provide consistent results at showreel-level quality. You can also rent these via Kitsplit or similar for as little as $10 per day (24 hrs), so if you have a Jitter piece you need to record at high resolution/frame rate, renting an HDMI recorder might be a good alternative.
You’ll find that, because the capture is happening off the machine you’re working on, you can record at significantly higher frame rates and resolutions, and free up a lot of headroom for other Max work happening concurrently.


Over the years I’ve used the same or a similar setup to my colleagues, either recording directly in Jitter or by using Syphon. That was until recently, when I got hold of an Atomos Ninja V. The Atomos Ninja V is something that gets widely used in the film industry for recording b-roll and the like off camera, but how does it work with Jitter? You can think of it basically as an HDMI recorder (it records to an attached SSD). It has an HDMI input and an HDMI output (a direct loop-through from the input) and works just like an external monitor for your computer (albeit a bit smaller): simply take the HDMI output of your machine and connect it via an HDMI cable to the Atomos Ninja V; it’s that easy. When your Jitter patch is all ready to record, drag the Jitter window displaying your patch onto the Atomos “monitor”, set it to full screen, hit record on the Atomos and you’re away.
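If you’d rather have that full-screen step happen on cue than by hand, the window state can also be flipped from a [js] object. This is just a convenience sketch, assuming the jit.world in your patch has been given the scripting name “world” and that it accepts a fullscreen message the way jit.window does.

```javascript
// Convenience sketch for a [js] object: toggle the output window to
// fullscreen once it has been dragged onto the Atomos "monitor".
// Assumes the jit.world in this patcher has the scripting name "world"
// and responds to a fullscreen message, as jit.window does.
function fullscreen(state) {
    var world = this.patcher.getnamed("world");
    if (world) {
        world.message("fullscreen", state); // 1 = fullscreen, 0 = windowed
    }
}
```

Sending this object “fullscreen 1” from a toggle or a cue list then puts the window on the Atomos before you hit record on the device itself.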
