I explore multiple solutions, and remind you not to believe everything you read on the internet.
THE CONTEXT You've got an edit in FCP X and want to bring it into Adobe Premiere. Or, what actually happened: you (hypothetically; you may not have actually done this, but someone I work with did) synchronized some video and audio from a Blackmagic Pocket Cinema Camera and an H4n in FCP X, created a stringout timeline, and want to bring that into Premiere.
THE PROBLEM Premiere can't import fcpxml's, and FCP X can't really export anything else (*cough*seriouslynotevenEDL's?*cough*). I'm not going to go into why this is, or discuss my feelings about it here. Let's keep this simple.
THE SOLUTION First off, don't listen to these people unless you want to needlessly waste lots of time and storage space by making AAF's of everything. This is not good advice. There may be specific scenarios where this is necessary/helpful, but by no means should it be your first inclination, and it definitely overlooks much simpler and more versatile options.
If you've got $10 you can take the advice of Adobe or Larry Jordan and use Xto7. It'll take the fcpxml from FCP X and convert it to an XML that Premiere can read. As an added bonus it works the other way round as well. Pretty straightforward stuff.
Step 1: Spend $10 on Xto7.
Step 2: Export an fcpxml of your timeline from FCP X.
Step 3: Import the fcpxml into Xto7 and export a regular XML.
Step 4: Import the regular XML into Premiere.
Or you can save that $10 and Resolve* the situation for free. I've said it before and I'll say it again... Blackmagic's DaVinci Resolve is an amazing piece of software. It's robust, well designed, and extraordinarily versatile. The free version is surprisingly full featured and should be in everyone's post toolbox. No, I do not get paid by them or anything like that. I simply use it a lot, and really appreciate what it can do.
In this case Resolve understands both "standard" XML and FCP X's flavor, so it's essentially performing the same translation role as Xto7.
Step 1: Export an fcpxml of your timeline from FCP X.
Step 2: Import the fcpxml into Resolve and export a regular XML.
Step 3: Import the regular XML into Premiere.
Step 4: Spend the $10 you saved on beer, feel smug.
FURTHER THOUGHTS With any translation of a sequence from one program to another, certain things are likely to be lost. I'm not going to talk about what happens if your footage doesn't work in one program, or time remapping, or any of that right now. Suffice to say, a lot can go wrong. Simply for titles, transitions, and clips/edits (what I was dealing with in the situation that prompted this post), Resolve works fine. The editor had actually tried using Resolve to translate the fcpxml but ran into an issue with synchronized clips created by FCP X's PluralEyes-esque auto syncing.
The clips were coming into Resolve fine...
After bringing the FCPXML into Resolve you can see Resolve gets the synced clip, and the source video and audio media. Note: The timecode of the camera and recorder were not synced, as the H4n wouldn't know timecode if you jammed it down its throat, so ignore that.
but not to Premiere.
? indeed my friend.
Trying to reconnect in Premiere. Problem 1: You can't link one clip to two files (video and audio) Problem 2: It has no idea what either of those files might be.
This seems like a situation where Xto7 provides an advantage, as its user guide mentions it can handle synchronized clips. The problem here isn't synchronized clips though; it's that the editor had only done a stringout and therefore hadn't set any in and out points for any of the clips. Since audio was rolling before picture and stopped after, the source audio clips were all longer than the source video clips. Therefore the synchronized "compound" clips were as long as the audio files, with black at the head and tail where video wasn't rolling. If this wasn't a stringout, but rather an actual edit, you wouldn't even have this problem, as you wouldn't have any of that pre/post roll in the edit. As it was, the issue was that Premiere was expecting video for the duration of the synchronized clip, and wasn't getting it. Resolve was fine because it understands FCPXML and could react accordingly. We can prove this.
In Resolve, if you try to conform the audio of the synchronized clip to the source file it works fine, but if you try to conform the video of the synchronized clip (what Premiere tries to do when importing the regular XML) it fails, complaining the original video file doesn't have enough content. Now that that's settled, you have two choices. If you want to bring the synchronized clips in the bin over to Premiere intact, all you need to do is trim the synced clips down so they are the same duration as the shorter of the two source clips (in our case the video). To put it another way, just cut off the black head and tail of the synced clips in FCP X before you output the FCPXML and everything works fine (this is what I had the editor actually do).
It drops "synchronized" from the name but I assure you it's the synced clip. If you're having trouble wrapping your brain around my explanation look carefully at the duration of all the clips in these screenshots. Hopefully your lightbulb will go off then.
If you don't care about maintaining the synchronized clips in your bin, and just want the timeline, you can simply select all the clips on the timeline in Resolve and choose "Decompose in Place" from the right-click menu. This breaks the synchronized clips into their source video and audio parts in place on the timeline, which will come over to Premiere just fine.
In Premiere after Decompose in Place was selected in Resolve. You trade your synchronized clip for the source media, but will maintain any edits on your timeline. This might actually be what you want, how am I supposed to know what the heck you're trying to do? I'm trying to educate by example.
Again, in this situation titles and transitions came over fine. As for crazy stuff like multicam and alternate takes from FCP X, some of that is talked about in the user guide for Xto7, and maybe then it earns its keep. But the free solution is so easy, why not try it first and see if it works?
In conclusion, I'm probably over-explaining things as usual, but I do so in hopes you can understand things better. Because as I illustrated with that awful link to the potentially pointless AAF workflow in the solution section of this post, you should not listen to someone just because they wrote a thing. Make sure they know what they're talking about first.
*I like using resolve as a verb like people use Photoshop as a verb. It makes me giggle because it's already a verb that basically means the same thing anyways.
"How were you able to view those clips from the card? Premiere said they couldn't be played." "Oh the metadata got corrupted when the card got dumped but I Resolved them and it's all good now."
While visiting a friend I noticed an empty Premiere project open with an importing dialog box that didn't seem to be moving. When I asked him if his machine was frozen he revealed that no, it wasn't, he had simply dropped a folder containing a feature film's worth of Cinema DNG footage into Premiere and it was taking a while to import. A while in this case meant having run for about 5 hours and having a progress bar that was only barely visible.
I tried to explain that his machine or Premiere would probably crash before it could finish, and that even if the import did finish he would not like or want the results. Another friend who works in video/tech was present and surprisingly agreed that although it may take a while, dragging and dropping the whole shebang into the project pane should be just fine. I did my best to resist the urge to facepalm.
He's a shooter rather than editor by trade, and the problem here is a weird one, so I don't fault him for not knowing any better. We've known each other since film school though, and he should know better than to ignore my advice on such matters!
Some 30 hours later I got a text message:
My system has run out of application memory!!! "quit any applications you are not using" I'm not using any others!
The need is to import a large amount of Blackmagic Pocket Cinema camera footage (in raw CinemaDNG format) into Premiere. There is a master folder containing a subfolder for each scene, which in turn contain the clips for that scene. It is desirable to translate this scene folder structure into bins in Premiere, as it is the only manner of organization that exists for the footage.
A quick primer on CinemaDNG if you haven't yet had the pleasure. It's much like DPX in that it stores each frame as a separate image file. So you end up with a folder for each clip that contains the image sequence that makes up the video frames, alongside other assets like in-camera audio.
A clip from the BMPCC. The clip's name is that of the folder, TV_1_2015-06-22_2130_C0000. You can see we have an in-camera audio file as well.
So what was I aware of that my friends weren't?
Batch import of CinemaDNG is broken in Premiere. Adobe claims it's fully supported, and I have to wonder if they mean that in the sense that duct tape can support your car's bumper to the frame. If you go into the clip folder and have the eyeball set to the proper format you're fine, but trying to import multiple clips (i.e. folders, as we saw above) at once takes forever and then still isn't quite right.
Here we are looking at the contents of "TV_1_2016-06-22_2130_C000000" just as we were in the Finder. If your eyeball is set properly, Premiere interprets everything correctly and you can drag that clip to a bin no problem.
All is not well though. If we look at the parent folder bmpc raw we can no longer set our eyeball and Premiere can't see the folders for what they really are.
As I just illustrated, Premiere is capable of seeing these folders as clips if you are only looking at one at a time, but try to look at a folder containing multiple clips and it plays dumb. I suppose this is what people mean when they say they "can't see the forest for the trees". Trying to import one or any number of these folders will give you a major headache, as my friend learned the hard way. Here's a video from a Creative Cow post illustrating how you can import clips one at a time, but not in bulk. I can only assume the uploader had to mute the audio due to the numerous expletives uttered.
What that video doesn't show is what happens if you let that import finish. I thought I'd try using the clip I've been showing in my screenshots so far. This is the result.
Do not want.
First the good news: all of those clips play back fine with sync sound and are exactly the same. The bad news: you also get a containing bin with the clip name and the standalone .wav audio (not pictured). It seems each frame is interpreted as an entire clip, so you're left with a redundant bin and .wav file, plus a number of duplicate clips equal to the total number of frames in your clip. For this 13 second clip at 24 fps, that's 312 in all.
Which is probably why it takes so long to import. If you watch Activity Monitor you can see the import process is only single threaded, and it has to read every DNG in the sequence 312 times over! If you'll recall, the clip should only take about 1.6 seconds to import properly. Using this method though, on a new Mac Pro with 32 GB of RAM and dual D700 GPUs, connected via Fibre to our SAN storage capable of 1000 MB/s read/write, it took 1 minute and 45 seconds... to import our 13 second clip, which is 661.6 MB on disk. That's actually faster than the math comes out to, but we'll chalk that up to read caching on the volume. Frankly, with the amount of footage from the feature, I'm surprised my friend's machine held out for 30 hours, after which it was "slightly more than halfway" done.
To make matters worse, once the import finishes Premiere still sees these as unique clips and will instantly start generating media cache info such as peak files, thumbnails, and the like for each. Not only is this a waste of storage space, it all but ensures you'll be unable to hit save before your computer curses you with its last dying beep.
This workflow is broken by design in Premiere. Seemingly the only option is to import clips one at a time, as shown in the above video. Which is fine if you've got a handful of clips, but we're talking about a feature length film here with about 1/3 of the shot footage in Cinema DNG format. How do we avoid being a washing machine here?
To be honest, although I was aware of the problem, I didn't know of a better way than what I mentioned above. In the few cases where this came up, the amount of footage was small enough that importing each clip individually wasn't the end of the world. After I got my "I told you so" out of the way, I set about finding a better solution to help my friend.
Generally my first idea whenever something impedes my workflow is to check if Blackmagic's software can... *ahem* Resolve the issue. (Can we please make Resolve a verb like Photoshop? "Did you Resolve those proxies yet?")
Resolve is a fantastically designed piece of software that is constantly improving. It does so much more than color grading, and the fact that there is a free version with a large amount of the full feature set makes it a viable tool for everyone. Maybe I'll do a toolbox post on it one day, as I use it for numerous things.
I know for a fact Resolve handles CinemaDNG's flawlessly, as I've graded many programs in the format. Also, the footage is coming from a camera made by the same company, so I wondered if it had any tricks up its sleeve that might help.
A view of the same bmpc raw folder in Resolve. This is more like it. Those subfolder-to-bin options? Be still my beating heart.
So getting things into Resolve with bins is a no-brainer. But how do we get that info to Premiere? And let's assume we don't want to transcode everything to ProRes, because, well, my friend didn't want to. Let's also assume we can't convince my friend to just edit the film in Resolve, because it's really none of my business what NLE he uses.
Creating a timeline stringout of a bin in Resolve is a trivial process, and we could then export an XML for each, importing into matching bins in Premiere. If we had enough metadata to go by we could simply make a timeline of all the footage in Resolve, export an XML, and then sort things out into bins using the search functions in Premiere. In this case there was no shot/take/reel metadata recorded, so the first option would be required. It's not what I'd call a great solution, but it has the benefit of requiring no expense, and is certainly a step up from Premiere's option.
I did a quick test and noticed that when I brought the XML into Premiere it couldn't link up to the files. After comparing an XML of a CinemaDNG clip from Premiere to one from Resolve, I quickly spotted why.
Premiere only identifies the first frame in the image sequence, whereas Resolve uses a range notation to precisely indicate the files that make up the clip. It wouldn't be difficult to use whatever utility you're comfortable with to do a find and replace operation on the Resolve XML to match it to Premiere's desired format. Trying exactly that, I can report it works as expected. But this workflow was already a bit more involved than I'd like, and now you're talking about running it through a text-editing utility that may cost money or require specialized knowledge to operate. It could work, but it still feels a bit too wishy washy to me.
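If you do go the find-and-replace route, it's a one-liner from the command line. A sketch, with the caveat that the exact notation in your XML may differ from my hypothetical example here (check your own export first):

```shell
# Hypothetical: suppose Resolve writes the clip path with a frame-range
# notation like clip_[000000-000311].dng, while Premiere wants only the
# first frame, clip_000000.dng. This sed pass collapses every range in
# the XML down to its first frame:
sed -E 's/\[([0-9]+)-[0-9]+\]/\1/g' resolve_export.xml > premiere_ready.xml
```

The same substitution works in any text editor that supports regular expressions, if you'd rather not touch a terminal.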
I then found this excellent post on Digital Bolex with a really clever workaround that harnesses the power of Spotlight on the Mac. Most relevant bit below:
Navigate to the date folder where all of your DNG files are stored. (If you have multiple dates, you will want to search within the DCIM folder.)
In the Search Bar at the top of the dialog, search for "0001.dng"—this will find the first frame from each shot of your footage, and it's importing the first frame which helps Premiere recognize that you are trying to load an image sequence. Using the search bar to find the first frames is the preferable and recommended way right now to import CinemaDNG footage into Premiere.
I tested for myself and have to say it's quite a brilliant solution considering how simple it is. Depending on the folder you search you can either bring in all your clips at once, or in groups by their folder structure for placement directly into corresponding bins.
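If you want to sanity check what that search will surface before you open Premiere, you can list the first frames from the command line. A sketch; the paths are placeholders, and note the BMPCC names its first frame differently than the Bolex (ending in 000000.dng rather than 0001.dng), so adjust the pattern to your camera:

```shell
# List the first frame of every clip under the footage folder.
# *000000.dng matches BMPCC-style first frames; use *0001.dng for
# Digital Bolex style naming.
find /Volumes/footage/bmpc_raw -name '*000000.dng'

# On a Mac with Spotlight indexing enabled, mdfind answers the same
# question from the index, which is much faster on large volumes:
mdfind -onlyin /Volumes/footage/bmpc_raw -name '000000.dng'
```

If the list matches your clip count, the Spotlight import should pick everything up.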
P.S. Windows users, I can't help you, but this thread might. I have not tested its claims in any way, but it seems to operate on the same principle as the Spotlight trick.
Sure enough, using this trick to reconnect the ProRes files to the R3D originals worked.
Note: File Name and File Extension must be unchecked, otherwise the .mov to .r3d mismatch will throw things off; in their place I turned on Media Start and Tape Name to give it something to match with. Also make sure Use Media Browser to locate files is unchecked, or you won't get the ability to use Spotlight.
Here is what the Finder import window looked like using the Spotlight trick
Exposing the first piece of each spanned R3D clip in the RDM directory.
In the above screenshots we're looking to reconnect the A006_C001_0112DV.mov proxy to its original. So after we selected that first R3D file in the list, everything else in the RDM directory came online perfectly. So much win.
Except I had to cheat a bit to get you the above picture. I copied some of the R3D footage to a local drive and ran it from there, because our editors work off an Xsan on which we've disabled Spotlight, as it was causing massive instability and hurting volume performance. So this amazing trick doesn't work for us.
That's OK though, because we can use the same basic principle: presenting the first file of each clip together in one "virtual" folder to Premiere. We can do this by creating a folder of symbolic links.
To do that we can write a simple bash script like this:
#!/bin/bash
# Run from the RDM folder containing the individual RDC clip folders.
# Collect a symlink to the first segment of every R3D clip into ./links
mkdir ./links
for clip in */*001.R3D
do
    ln -s "$PWD"/"$clip" ./links
done
The above example would be run from the RDM folder containing the individual RDC folders, but with a little tweaking you can make it process down as many subdirectories as you'd like and make a link folder for each one.
Here's the result of running our footagelinker script. The little arrows on the icons indicate the file is a link, pointing to an original elsewhere. You can see it's displaying the original media in the preview pane, even though the file itself is essentially empty.
To reconnect in Premiere we start off the same way as we would have with the Spotlight trick, checking/unchecking the appropriate boxes in the relink dialog. The difference is that we can then reconnect normally, using the Media Browser to point at the appropriate link for the first offline file, and the rest will reconnect accordingly.
This same process could be used for the original CinemaDNG subject matter of this post instead of the Spotlight method. You would just need to change the line

for clip in */*001.R3D

to

for clip in */*000000.dng
Or in the original case of my friend, with one master folder containing subfolders for each scene's worth of clips, you can tweak the processing just a bit.
#!/bin/bash
# Run from the master folder: scene subfolders contain clip folders,
# which hold the DNG sequences. Link each clip's first frame into a
# links folder inside its scene folder.
for clip in */*/*000000.dng
do
    folder=$(dirname "$clip" | rev | cut -d / -f 2 | rev)
    test -e "$folder"/links || mkdir "$folder"/links
    ln -s "$PWD"/"$clip" "$folder"/links
done
This creates a links folder inside of each scene folder with all the clips from that scene. Either way you can simply drag and drop the links folder(s) into Premiere and it will properly and quickly import the original footage.
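As a quick sanity check after running the linker (a sketch; it assumes the scene/clip folder layout described above and is run from the master folder), you can compare the number of clip folders in each scene against the number of links collected:

```shell
# For each scene folder, count clip subfolders (excluding links) and
# count the symlinks gathered in that scene's links folder.
for scene in */; do
  clips=$(find "$scene" -mindepth 1 -maxdepth 1 -type d ! -name links | wc -l)
  links=$(find "$scene"links -type l 2>/dev/null | wc -l)
  echo "${scene%/}: $clips clips, $links links"
done
```

If the two counts disagree for a scene, a clip folder is probably missing its first frame or uses different numbering.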
If we ever need to edit Cinema DNG in Premiere off our Xsan this is definitely how we will do it.
Blackmagic has a product called the Hyperdeck Studio which is essentially a tape deck that uses 2.5" SSD's instead of tapes. It even supports deck control via RS-422. They also make the Teranex, a hardware conversion box. So it should be simple to use these two products together for standards conversions right? Load your master(s) onto an SSD, pop it in, patch through the Teranex to an edit station with capture, and batch import your neatly converted media.
Well, there are some weird quirks, but with a little help from FFmpeg and an AJA capture interface it all works out.
We've recently been tasked with providing our masters of spots for foreign markets. After learning of these products and having a chat with a BM rep at one of their showcase events, I was told they would serve the purpose. We will be doing these conversions regularly and frequently, so a quick, reliable, hassle free workflow was the aim.
The Teranex does great conversions but only functions as an I/O box when hooked up via Thunderbolt. To properly convert it needs to sit in the middle of your playback and record signal chain. You can do station to station but there's no real way to synchronize playback/record and maintaining timecode requires some finagling of the recording in your NLE after record, trimming and then exporting.
Using the Hyperdeck Studio gives us deck control and the process becomes more straightforward, much more like tape to tape dub.
1. Files with stereo mixes played fine, but anything else (like a 5.1 mix) would have no audio. As explained by Kduran818:
The only way so far, that I was able to get multi-track or 5.1 audio to come out of the Hyperdeck was creating a 16 channel multitrack file at 48khz at 24bit (it will not work if you have discrete tracks)
He also claims 16 bit audio won't work at all but I've yet to test this as almost all our material is 24 bit. I read somewhere else that this is a bug, and sure seems like it but I don't have confirmation at the moment.
2. In this case we were taking a 23.976 spot and converting to 25. The Teranex can be set to take the embedded timecode coming off the Hyperdeck, regen it at 25, and pass that on to Premiere, but since Premiere was deck controlling the Hyperdeck it was working in 23.976 even though it was capturing 25 to disk. I think you can guess how that worked: somewhere between poorly and not at all.
1. You can add empty audio to pad out those 16 channels any way you want, but I'm sure whatever workflow you're running through in your mind right now isn't elegant, quick, or something you'd want to do every time you need to do a standards conversion on a spot. I wanted something our AV guys could utilize as a droplet. It's fitting that my last post was about FFmpeg as a tool, because weird scenarios like this are where it shines.
Assuming you have 5.1 audio properly mapped within a single track/stream, a one-line FFmpeg command will make you a file that plays back audio on the Hyperdeck. It looks wonky, especially the part where we take 5 channels of null input and double it to get 10, which we append to our 6 channels of surround, but FFmpeg gets picky about odd channel configs. Try to make 10 null channels and watch FFmpeg freak out if you don't believe me. I've tested this numerous times though, and as long as your input has 5.1 audio in the standard order in a single stream, this one-liner works great. Timecode is maintained, as are the levels and order of our surround mix. We are essentially just throwing silence onto 10 channels after the 5.1 layout. Since we're not re-encoding our video and we're only adding silence, the command's speed is really only limited by how long your disks take to create a copy of the file. In our case it's about 15 seconds for a 30 second spot.
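For reference, a sketch of such a command, assuming a QuickTime master with its 5.1 mix in a single 48 kHz stream (the filenames and the 24 bit PCM codec choice are placeholders; adapt to your masters):

```shell
# Merge the 6-channel 5.1 stream with two copies of a 5-channel null
# source (6 + 5 + 5 = 16 channels), copy the video untouched, and stop
# at the length of the shortest (real) input.
ffmpeg -i master_5.1.mov \
  -f lavfi -i anullsrc=channel_layout=5.0:sample_rate=48000 \
  -filter_complex "[0:a][1:a][1:a]amerge=inputs=3[a16]" \
  -map 0:v -map "[a16]" -c:v copy -c:a pcm_s24le -shortest padded_16ch.mov
```

Again, this is a sketch under my assumptions about the source file, not the exact recipe; the point is the amerge of surround plus two null feeds to hit 16 channels.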
For now I've just popped this into a shell script that takes a filename and output name as input, but I'll probably use the previously mentioned Pashua to make a GUI with drag and drop to be a bit more friendly for our AV guys. If I want to get fancy I can even add a line to copy the file to a volume with a default name (for our SSD in a toaster style dock), so when it's done the SSD is ready to eject.
2. The key to solving the deck being in 23.976 and the timecode coming off the Teranex being 25 was to use an AJA capture interface instead of a Blackmagic one. In Premiere the AJA driver allows you to set the timebase of your deck (see screenshot below) rather than simply listening to what the serial connection is telling it.
Changing the timebase to match our converted fps was the key.
Our BM Ultrastudio 4K had no such option. This wouldn't matter if we weren't changing the framerate, but more often than not that's exactly what we need the Teranex for. Also worth noting real quick: the Teranex has a processing delay of 2 frames, and our capture hardware added 8. So we ended up going with a timecode offset of 10 frames to give us a frame accurate conversion. Your mileage may vary.
An alternate solution could be to play back from the edit station through the Teranex to the Hyperdeck in record mode. The Hyperdeck has the ability to start recording when fed timecode and stop when timecode stops. I tested this at first with the Teranex set to input regen mode. However, in that mode the Teranex remains parked on HH:MM:SS until playback starts but constantly runs the frame count even while paused, which the Hyperdeck sees as active timecode, so it constantly records. It should be possible to set the Teranex to jam sync at program start/01:00:00:00 and play back from the edit station with a leader, so that the Teranex withholds all timecode from the Hyperdeck until program start and then passes on freely generated timecode, which should properly trigger the record start/stop. Furthermore, setting up Premiere to play out 16 channels of audio could negate the need for FFmpeg to be involved. This idea didn't come to me till later, and I haven't tested it yet, as we're happy enough to have a reliable solution for now. I'll update this post with my findings if I do get around to it.
Also, I haven't been able to get the Hyperdeck Studio to play back any SD files yet. It doesn't even claim to see the files on the SSD. I may give support a call, as I haven't been able to find anything online, and I've tried numerous files all well within the supported specs. It would also be good to see if this whole 16 channel audio thing has been acknowledged as an issue, and if there's any plan for a fix.
Playing a spot with 5.1 audio out of a workstation via the AJA Control Room software into the Hyperdeck does record the audio properly, sort of. The 5.1 channels stay in the correct order and are unaffected, but the Hyperdeck records a full 16 channels in one stream, thus messing up the mapping. So the resulting file would need to have the empty channels pruned and the original audio remapped, either via FFmpeg or your NLE of choice.
Furthermore, running through the Teranex and jam syncing the timecode (on a framerate conversion) works to trigger the start of the recording, but once the Teranex starts generating timecode it will go on forever unless you tell it to stop. So you'll have to manually stop the Hyperdeck at the end of the recording, which makes this workflow not a true batch/automated solution. You'll also have to hit "Start" in the timecode menu on the Teranex after each spot, which stops the timecode output (and thus Hyperdeck recording) and resets its search for the jam sync value on the next spot. If you weren't doing a framerate conversion, setting the Teranex to pass through timecode triggers start and stop on the Hyperdeck just fine. There is an SDI start/stop trigger on the Hyperdeck, but that doesn't seem to work; I imagine the output card or Teranex isn't passing along that trigger.
For these reasons I believe going out of the Hyperdeck into an edit station is preferable: only one FFmpeg command to run (a quick one at that) and truly accurate, automated capture.
This is an attempt at an idea I had to document the tools, software or otherwise, that I find indispensable. Tools can be used in many different ways and for different reasons so I'll try and keep these sort of abstract but provide enough links to explore further if you're so inclined.
It's not always the best solution for every situation, but I'll be darned if it can't handle just about ANY situation. To list its abilities and format support is beyond the scope of this post, but suffice it to say they are both... extensive. It's also free, open source, and runs on just about everything. Some things I use FFmpeg regularly for: container swaps, trimming or extracting compressed video/audio without needing to re-encode, processing image sequences, auto cropping material to remove baked-in padding, adding fade ins/outs to media, and so on. There's also a companion tool, FFprobe, which is a great way of examining the stream properties of files and their metadata, and FFplay, a playback app that can run the same parameters as the encoder, so you can preview things such as aspect width adjustments or other filters before encoding them.
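As a taste of the container-swap and batch-processing use cases (a sketch; the filenames are placeholders), here's a loop that rewraps every MP4 in a folder as a MOV without touching the encoded streams:

```shell
# Rewrap each MP4 as a MOV without re-encoding (-c copy swaps the
# container only). The echo makes this a dry run that just prints
# each command; delete the echo to actually run the conversions.
for f in *.mp4; do
  echo ffmpeg -i "$f" -c copy "${f%.mp4}.mov"
done
```

Because nothing is re-encoded, each file takes about as long as a file copy, which is what makes FFmpeg so handy for quick container fixes.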
If you're familiar with scripting you can also use FFmpeg to do some really neat stuff: batch processing folders, compiling metadata on a group of media, playing two versions of a file back side by side in sync for comparison. It even has a streaming media server component (ffserver). If you want to get fancy you can use something like Pashua to make a little GUI for your FFmpeg recipes, essentially creating droplets for different scenarios.
If you're on a Mac and looking to get FFmpeg your two best options are:
Someone, who may or may not be the real MVP, offers a precompiled Mac binary that you can quickly download to give FFmpeg powers to any machine you happen to be on. Or install it through Homebrew, which gives you the ability to build with the particular options you might want, like Ogg Vorbis support, and is much easier to keep up to date.
(Windows precompiled here, though I haven't tested these personally. If you're on Linux you don't need me telling you what to do.)
In terms of figuring out syntax and functionality, the best resource for me has been the official documentation which seems to have a little more in depth info/examples than the man pages or filter help strings. It definitely has a steep learning curve at first. It helps to have a specific goal in mind while trying things out, otherwise it can be a bit overwhelming.
On that note, I recently became aware of a project called iFFmpeg, which is a GUI wrapper that asks you to BYOB (bring your own binary). I've tested it with both the precompiled binary linked above and my version from Homebrew, and both seem to work fine. The app is 20 bucks and is just a fancy set of clothes that doesn't offer any additional functionality over the free command line app. That being said, it is pretty slick and can make things a lot easier if you're not a command line user.
It also has the potential to be a great learning tool: you can set up all your parameters, and when you hit OK it actually outputs the full command line syntax of your choices (since it's just interacting with the FFmpeg binary), which can be a great way to jumpstart your understanding of which flags/switches do what. There's nothing wrong with using this app as is, but you lose out on some of the more advanced capabilities that come from integrating FFmpeg with your own scripts/apps.
This is pretty specific, but it's also pretty annoying and I'm sure I can't be the only one this applies to.
We have images with all the preinstalled software and settings that we roll out to our edit machines as needed for upgrades and what have you. Once rolled out, these images still need to have certain software serialized, as each seat needs a unique serial. Also of note here: our editors log in with network accounts to an Open Directory server.
I haven't really tested this extensively, but I've seen this issue across OS X 10.8-10.9 and in the Trapcode Suite, Effects Suite, Color Suite... my non-thorough conclusion is that any plugin that uses Red Giant's serialization method will be affected on any OS X machine logging in with a network account.
Entering a serial while logged in with an Open Directory account doesn't really take, even though there's no real indication anything is wrong, aside from the red X that makes your editor feel they are somehow undeserving of a full license. Was it something they said?
I've got no real caption for this. It's pretty straightforward stuff.
Log into a local admin account on the machine and serialize there. It will do the trick and remove the watermark.
Judging by Red Giant's FAQ, the issue stems from the fact that /Library/System belongs to root/wheel. They give you some troubleshooting steps to change permissions on that, but I don't really feel like messing with system permissions if I can avoid it, and it takes longer than using a local account for a one-time registration.
I don't even remember when I discovered this, it's been so long it's second nature by now. I do recall being puzzled when I first encountered it though, particularly since we haven't had any problems serializing any of the other multitude of plugins we use through Admin OD accounts.
This one is deceptively simple, yet could screw you over big time if you're not paying attention.
You're all set to re-image a machine with some fancy new software. You reboot and are at the NetRestore screen. Suddenly you get an email.
Hey, can you hold off imaging XXxxXXX's machine for another day? Despite leading you to believe the contrary, they have in fact not backed up all their work. Thanks, Boss Person
OK, instead of clicking "Continue" let's click "Go Back".
Clicking "Go Back" still starts the restore process. I was so incredulous that I was a gibbon and clicked the wrong button I set up a dummy test machine and filmed the process again.
Pressing "Continue" still performs the restore, in case you were wondering. This was also filmed with a potato, in case you were really wondering.
Don't click "Go Back". Pull the plug if you change your mind about a restore. Just don't click "Go Back" unless you actually want to click "Continue' but still want to achieve a sense of non-conformity.
I can't recall backing out of a restore like this any other time, as I always ask and re-ask people if all their stuff is in order first. That being said, this particular occurrence happened (and is reproducible) with our iMac NetRestore server running 10.9.5, with Server 3.2.2.
Also, if you let the restore initiated by the "Go Back" button complete, it reboots to the same NetRestore screen with no indication that the restore was successful (I'm sure it gave some message while I wasn't sitting at the screen before it rebooted, though). Furthermore, the "Continue" button doesn't exist at that point, even if you select a volume to restore to. Upon setting the startup disk to the local drive and rebooting, I confirmed that the image was in fact successfully restored.
NOT PICTURED: Screenshot of the above paragraph, because my phone was in my other pants at the time.
Some quick searching has been unable to pull up any other record of this happening, so I submitted a report with the details to Apple. I'll update this post if I hear back.
I know I tend to post some weird broken stuff on here, but this one is pretty out there. Please tell me if you've seen it as well so I can feel a little less crazy.
CC 2014.1 screws up playback with AJA devices in a really irritating way. It's stuff like this that really makes me wish Adobe offered some sort of stable release of CC for post houses, rather than constantly pushing out updates that fix some things while breaking others.
We had a RED Dragon project coming in that was going to be extremely After Effects heavy and quickly discovered this fun fact, which required an update to Adobe CC 2014.1 to fix.
The After Effects team are investigating a bug in After Effects CC 2014 (13.0) where RED (.R3D) files are very slow and the image resolution is poor, about 1/8 sampling.
With CC 2014.1 you'll have audio glitches, cutouts, and a lot of instability unless you update your AJA Adobe plugin to version 10.6 which adds official support for CC 2014.1.
AJA has finally introduced an update that has their own home-brew version of delay controls. It is less precise (frame-based vs. millisecond offset) but it works well enough and finally puts this issue to bed. Their release notes shed a little more light.
Added Premiere Pro desktop frame delay for syncing the desktop monitor with the output monitor in instances where the desktop monitor or other electronics downstream from the AJA device are introducing delay. NOTE: This is not the millisecond delay provided by Adobe, but can be found in Preferences>Playback>Video Device>AJA Device>Setup>Output Offset. In a future version of Premiere Pro, the Adobe provided millisecond offset will work. The interim AJA provided offset has two limitations:
• Use of the offset will cause the audio and video to be out of sync. Audio can then be adjusted using the Adobe provided millisecond delay features in Preferences>Playback>Audio Device.
• Use of this video offset can cause a missed frame when laying off to tape. Be sure to set this offset to zero before layoff.
I'm not even going to point out that "electronics downstream" weren't the problem, as there is zero delay when using the same hardware with other software on the machine. Or that using this doesn't throw the audio out of sync; it in fact fixes the sync problem. A frame offset value of 3 frames gets you close enough to work with the Io XT and Io 4K. The reality is that it's between 3 and 4, but sub-frame is close enough for now. AJA hardware is finally usable in Premiere again. It only took four and a half months!
We have AJA Io XT's and Io 4K's for playback to our broadcast monitors. Out of Premiere there has always been a slight delay; e.g. a cut will happen on the computer monitor and will then happen a short while later on the broadcast monitor. It was just enough to make the editors feel uneasy, and make you question what was in sync to the audio when cutting something to music.
When we first set all this up I wondered if it wasn't some processing in the broadcast monitor or elsewhere in the chain. I tested with FCP X and AJA's own control room software and experienced no perceptible delay at all. This made sense in a way, as Premiere, unlike those two, relies on an AJA-provided plugin in addition to the hardware driver for the device. (Avid does as well, but we are not an Avid house so I'm not even going there.) So the delay must be stemming from the communication between Adobe's code and AJA's driver via the plugin.
Thankfully it was easy enough to solve through a little trial and error with some bars/tone and 2-pops to dial in an offset for the device, which is a feature built into Premiere.
Premiere Pro > Preferences > Playback
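If you want a starting value for that millisecond field instead of dialing it in purely by ear, converting a frame count at your sequence's frame rate is simple arithmetic. A quick sketch (the 3-frame and 23.976 fps numbers are just an example; plug in your own):

```shell
#!/bin/sh
# Convert a frame offset to milliseconds for Premiere's playback
# offset field. FRAMES and FPS are example values, not gospel.
FRAMES=3
FPS=23.976
awk -v f="$FRAMES" -v r="$FPS" 'BEGIN { printf "%.0f ms\n", f / r * 1000 }'
# prints "125 ms"
```

From there it's just fine-tuning by eye/ear with the bars-and-2-pop clip.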
Unfortunately, while adding support for CC 2014.1, the updated AJA Adobe plugin removed support for the offset functionality. Well, the functionality is still part of Premiere, but per the release notes on the AJA plugin:
Millisecond delay controls will not function for video device in Premiere Pro Playback Preferences.
Starting with the AJA Adobe Plugin version 10.5.2 they implemented a new buffering system.
New Hardware buffering setting in the Transmit plug-in.
• Standard – This mode uses an 8 frame buffer and is recommended for use when mastering tapes. This is the default mode.
• Minimum – This mode uses a 3 frame buffer. Using this mode will increase the likelihood of dropped frames on the output when system resources are not keeping up, but can improve the stop/start and JKL performance on the timeline while editing.
• Maximum – This mode uses a 14 frame buffer and should be used any time occasional frame drops are occurring and limited system resources are the suspected culprit. For example, with an underpowered, older Mac Mini, or a system without a Mercury Approved GPU, this setting may help mitigate minor frame drops.
Clicking Setup next to Io XT in the above screenshot gets you here. Choose your amount of frustration: maximum, standard, or minimum.
So, I'll give you a few frames for that to sync in... *ZING!*
No matter what we are going to have our program monitor in Premiere be at least 3 frames out of sync from our broadcast monitor. And there's no real way to fix it.
Surely a large company like AJA would scramble to fix support for such a large NLE platform if they knew this was a thing right?
I was just informed about the Slide/Delay, and that it is a known issue, AJA and Adobe are working on … I have entered you into the database with case ID# D5232 for notification when the next release will be with a fiX.
Sorry I don’t have better news!
AJA Tech Support
That email was from November 25th, 2014. I followed up with them on January 13th, 2015 and received the following.
The Slide/Delay issue has not been resolved as of yet… you are on the notification list and will get an email as soon as the bug is fixed!
Sorry that we do not have an ETA !!
AJA Tech Support
Other than increasing the amount of exclamation marks in their apology, it looks like they haven't done much.
If anyone has anything to add on this one, I'd love to hear it. In the meantime I think we're SOL.
Nothing to do but deal with it. You don't really need to use that broadcast monitor anyways right?
In case you're wondering why we didn't roll back to 2014.0: it's because 2014.1 projects aren't backwards compatible. There is no easy out here.
With its Mercury Playback Engine, Adobe likes to push the idea of a fully online/native workflow fairly hard. For the most part, if you've got the hardware, it actually works pretty well. I recently discovered the hard way that, in certain circumstances, deviating from this model could mean you're going to be in for some chop. This is all in Premiere CC 2014.1, noted so that if Adobe ever fixes this you don't read this and think it's still broken. I do hope they fix this.
UPDATES
3/20/15 - See the Further Thoughts section of this post for a solution!
THE CONTEXT
We were expecting a drive full of Red footage. It was decided by those who decide such things that editing needed to start yesterday. So I'm told (not asked) that we'll be receiving H.264 proxies of all the footage via the web to start the edit with, and can online to the r3d files when we receive the drives. The proxies had matching timecode and filenames. So what's the problem?
Premiere doesn't reconnect to r3d files properly. I wish I was wrong about this, but unfortunately I'm not.
So the media browser in Premiere generally autodetects the type of media you're looking at and adjusts appropriately. For instance when opening up a dump of a Red card it sees .RDC folders as clips with scrubbable thumbnails...
A day at the beach this project was not.
...except when you access it via the reconnect media dialogue. It remains resolutely stupid, and if you don't know any better you'll be looking at a bunch of folders wondering how this is gonna work.
The same folder viewed through the media browser while re-linking assets.
You have to manually switch the browsing mode to RED via the eyeball. This is inconsequential to the bigger issue here, but it really bothers me nonetheless.
You clearly know this is RED footage. Are you just messing with me now?
Either way, you can relink to the clip in RED eyeball mode, or if browsing as a directory you can go inside and relink to the clipname_001.r3d. Doing this works and brings the clip online as expected. The catch is that neither of these methods works if you're trying to batch reconnect multiple clips. Premiere will just hang indefinitely, and that will be that. (Trust me, I've waited hours while trying to reconnect 3 sub-one-minute clips.)
Don't plan to do an offline-online workflow of Red footage in Premiere?
It seems absurd that such a prominent NLE could fail at such a simple task, and I'm not the only one to think so. This is what has been suggested as a workaround, or this.
Personally, I don't feel great about modifying the directory structure of master footage, particularly with spanned clips and the metadata that gets associated with Red stuff. Nor do I feel great about cracking open my project file and editing it. While both these options should technically not cause any trouble, it's a bit too risky for me.
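If you do go the directory-flattening route anyway, you can at least avoid touching the masters by copying rather than moving the .r3d files into a flat folder, and pointing the relink dialogue at that. A minimal sketch; the function name and example paths are mine, and note that spanned clips and sidecar metadata still deserve caution:

```shell
# Sketch of the flatten-the-folders workaround, done non-destructively:
# copy (never move) every .r3d out of its .RDC folder into one flat
# directory for relinking. flatten_r3d is a hypothetical helper name.
flatten_r3d() {
  src="$1"; dest="$2"
  mkdir -p "$dest"
  # -iname matches .r3d and .R3D; cp -n refuses to overwrite, so a
  # duplicate clip name across cards gets skipped, not clobbered.
  find "$src" -iname '*.r3d' -exec cp -n {} "$dest" \;
}

# Example (made-up paths):
# flatten_r3d /Volumes/RAID/RED_CARD_A /Volumes/RAID/RED_FLAT
```

This costs storage, but your original card dumps stay pristine, which addresses my main objection to the workaround as usually described.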
The proper solution would be to bounce an XML of the edit out to whatever software you're finishing in and bring things online there. I tested this with Resolve and Redcine-X; both worked as expected.
There are a number of reasons why waiting till picture lock and onlining in another program may be an issue. In our situation we needed access to the r3d source controls to balance color and exposure between shots, because the footage was shot so mismatched that the producer refused to approve client review of the rough cut. The H.264 proxies were also quarter-res, which didn't look too great on our 4K client monitor.
I'm not going to detail how we went about bringing it online, because at that point things depend on so many of the project's factors, such as audio source/organization, whether there's footage from other cameras, how far along in the edit things are, etc. There's no right answer, because this functionality is simply broken in Premiere, so it's gonna be messy regardless of how you do it. Most likely the answer will involve several XML exchanges between software that actually works. Or, if you're brave, moving all your source files around as outlined in the workaround I linked to above. What worked in this case may not necessarily work in others due to the previously mentioned variables.
I can't suggest strongly enough that you don't try to edit an offline Red project in Premiere, unless you're fine with keeping it offline until you get it out of Premiere, or with relinking every clip by hand.
---“The House on Elwell” / 2011 / 3 min / Watch Now
---"Midway Salvage" / 2014 / 20 min / Video Installation / Co-Editor With Amy O’Neill / Stills
---"Ice Mine" / 2013 / 2 min / Video Installation / Co-Editor With Amy O’Neill / Stills
---"Holy Land" / 2011 / 6 min / Video Installation / Co-Editor With Amy O’Neill / Stills
---“Joe’s Bar” / 2011 / 9 min / Video Installation / Co-Editor With Amy O’Neill / Stills
---“Forest Park Forest Zoo” / 2010 / 6 min / Video Installation / Co-Editor With Amy O’Neill / Stills / NY Times
---“Ballroom Dancing/Mortal Kombat” / 2010 / 3 min / Video Installation / Assistant to Amy O’Neill / Stills
---“How to Live Like Gwyneth Paltrow” / 2011 / Watch Now
---“How to Pack The Essential Winter Hiking Gear” / 2011 / Watch Now
---“How to Do The Shim Shell Quarter Trick” / 2011 / Watch Now
---“How to Make Green Eyes Pop” / 2010 / Watch Now
---"That night he dreamt about seagulls" / 2012 / Watch Now
Motion Pictures and Film | Greater New York City Area, US
Post Production tech with strong background in color correction and editing.
2013 - Present
Technical Coordinator / BBDO
Managing hardware and software services for in house post production facility.
Editor / Amy O'Neill
Editing and color for multimedia artist exhibiting across the world.
Tech consultant for all gallery showings and exhibitions.
Editor and Colorist / Freelance
Editing and Color services on a project by project basis for corporations and independent projects.
Manager & Sales Associate / Mikey's Hookup
Answering technical questions and providing solutions for customers at a Pro Audio/Video Computer Store that contained an Apple Authorized Repair center. Also had manager responsibilities such as opening/closing, counting the till, and repair check in/check out.
Post-Production Contractor / Appresentative
Corporate video editing for website redesign
Post-Production Contractor / Expressions Media
Editing and motion graphics for multicam live event videography company.
Post-Production Contractor / Howcast Media
Video production and editing for most watched how-to video corporation
Creative - Software Instructor / Apple
Taught one on one classes and group workshops on Apple software, mostly Final Cut Studio. Also assisted with mobile device repair and diagnostics at the Genius Bar.
School of Visual Arts
Film & Video (Concentration in Editing/Post-Production)