Everything posted by jan1

  1. So my first test of this produced some results, but no success. I was able to render out segments and produce an EDL that referenced the segments rather than the original clips. However:

     - The initial attempt to render segments as DNxHD failed with an error message (stating something about the codec not supporting this; more on that below). It worked once I switched to ProRes HQ.
     - Segments where the source file didn't have enough material for handles resulted in a freeze frame filling the gap. That may be OK, but it caught me by surprise. A black frame might be better so the issue doesn't get missed in QC; freeze frames are harder to spot.
     - The resulting EDL, once imported into Premiere, links back to the segment clips but ignores the offset for the handles. Each clip seems to start with the handle, not the in point.
     - The segments were rendered at the source clip's native frame rate rather than the project frame rate, so the new EDL in Premiere is a hodgepodge of 23.97, 29.97, 24, etc. In fact, I believe the initial DNxHD render failed because the codec doesn't support some of these frame rates (24.00). I haven't debugged everything in detail, but I'm worried that the mixed frame rates also lead to additional confusion in the resulting timeline.
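To make the handle-offset issue above concrete, here is a minimal frame-math sketch (my own illustrative function, not anything from Mistika's actual EDL writer): when a segment is rendered with head handles, the new EDL has to shift the source in-point by the handle length, otherwise every clip starts early by exactly that amount.

```python
def remap_to_segment(src_in: int, src_out: int, handle: int) -> tuple[int, int]:
    """Express an editorial source range in frames of the rendered
    segment file. The segment file begins `handle` frames before
    src_in, so the correct in-point into the new media is `handle`,
    not 0. An EDL that references frame 0 shows the symptom above:
    each clip starts with the handle instead of the in point."""
    duration = src_out - src_in
    return handle, handle + duration

# A 48-frame clip rendered with 12-frame handles: the editorial cut
# runs from frame 12 to frame 60 of the new segment file.
seg_in, seg_out = remap_to_segment(100, 148, 12)
assert (seg_in, seg_out) == (12, 60)
```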
  2. Thanks! As always, there's a way... I didn't realize that with these settings the EDL would use the render files Mistika made. I will test this workflow. Maybe we can add a feature request for an option that, when selected, extends a clip in the visual editor to include a specified number of handles on each side. That would allow any tracking to be completed into the handles.
  3. I couldn't quite decode this from the UI and docs, so I'm hoping there are some community answers. I'm working on a project where we are round-tripping from Premiere. I get camera source files and an XML to conform. I then need to render out individual clips with handles that can be put back into Premiere.

     I've seen that Mistika can render out individual clips (segments) and can attach handles. But can it also generate a new XML for those clips for the re-conform in Premiere or elsewhere? The obvious issue with rendering out individual clips is that there may be multiple timeline instances of the same source file. Keeping the original source filename causes conflicts, so the clips have to be renamed by segment, and thus a new XML is needed to re-conform. Resolve does this via its built-in Premiere XML export preset.

     And as an aside, if you render out clips with handles, how can you make sure in Mistika that any tracking and keying is extended into the handles? Is there a way to see the clip with handles in the Visual Editor while working? Thanks, Jan
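For what it's worth, the renaming problem above can be sketched like this (a hypothetical naming scheme, not what Resolve or Mistika actually use): number the repeated instances of the same source file in timeline order so each rendered segment gets a unique filename for the new XML to reference.

```python
from collections import Counter

def unique_segment_names(timeline_sources: list[str]) -> list[str]:
    """Rename timeline instances of the same source file so each
    rendered segment is unique. Illustrative scheme:
    <source>_seg<NNN>.mov, numbered per source in timeline order."""
    counts: Counter[str] = Counter()
    names = []
    for src in timeline_sources:
        counts[src] += 1
        stem = src.rsplit(".", 1)[0]
        names.append(f"{stem}_seg{counts[src]:03d}.mov")
    return names

# Two timeline instances of A001.mov no longer collide on disk:
events = ["A001.mov", "B002.mov", "A001.mov"]
assert unique_segment_names(events) == [
    "A001_seg001.mov", "B002_seg001.mov", "A001_seg002.mov"]
```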
  4. Yes, it is very ergonomic. It allows my wrist to rest on a small pad (similar to how you rest your wrist with a trackball) while you have full control over the user interface. You can then switch at a moment's notice to the pen if you need fine control, while using the same desk area. Using the pen all the time is harder on your wrist, plus you have to keep setting the pen down to type, while with the touch option it's a seamless hand movement between pointing and typing.
  5. Interesting. Very typical of my experience with Mistika; there are all these hidden features that are very powerful. With that said, there is something to the A/C mode (a sort of storyboard), because it keeps me in the flow. Once I have some initial grading done, I like to just progress through the storyboard, copy the grade to other clips, and make adjustments. Having to duplicate and run macros forces me to switch back and forth between what's going on in the timeline and what I'm doing in the visual editor. Once I'm grading, it's very nice to just stay in the visual editor and go back and forth there. Keep in mind that I keep my UI configured to a single monitor (I do have two, but due to their arrangement I find it easier to use just one), so switching between the timeline and the visual editor is more of a context switch for me. I think this is an area where Mistika will find, as it moves more toward Boutique and a more diverse hardware base, that you cannot make assumptions about how people work, and you'll see them preferring different ways than would be natural in the traditional setup.
  6. Love this. As I said, there always seems to be a way of doing it in Mistika. Though in this case a sort function on the Storyboard would be a lot more efficient. So hopefully we can get that added to the feature list.
  7. Apologies for the rash of topics/questions. I'm in the middle of a new project and still filling in some knowledge gaps with Mistika. On this project I have to do some skin cleanup with the clone brush. Nothing major, and not worth going to another app for. Everything works great using the Vector Paint node: making a quick track, then applying the clone brush, selecting each stroke, and copying the track (I still wish I could apply the same track to multiple shapes).

     But the issue I ran into this time is that on two longer clips the face turns sideways, so I really only need to apply the clone brush to segments of the clip. In the tracker window I can select a range of frame numbers to constrain the tracker, which worked. But there doesn't seem to be a corresponding feature in the paint editor; it's either one frame or all frames. I can't just duplicate a shape into a specific frame range. Silhouette lets you specify a work area with start/stop frames of the clip to scope operations.

     I ended up working around it by dividing the paint vector into three clips, deleting the middle one, and then doing the process for the left/right segments as usual. That worked, but it would be nice if there were an easier solution. Maybe there is and I haven't learned it yet? Jan
  8. I was wondering if there is an equivalent to Resolve's A/C mode and/or Avid Symphony's source/target grading: essentially a way of speeding up the grading of multiple clips from the same source media file. We often see this in dialog scenes and tutorials where there is a lot of cutting back and forth between two camera angles, resulting in multiple clips from the same source file.

     One way of doing it in Mistika is to go to the Storyboard, create gangs for each camera angle, then solo each gang and propagate the color grade. That works fine, but it's time consuming to set up the gangs. I think the equivalent to A/C mode would be a sort feature in the Storyboard: either sort by timeline order, or sort by reel/media and TC. The latter would automatically group everything from the same clip next to each other for quick operations. Another option would be one more propagate scope button, 'current media file'. Avid's implementation is more versatile but would require a more complex UI change.

     Maybe there already is a solution and I haven't come across it? There always is a solution in Mistika, isn't there? Jan
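The two sort orders described above really just amount to sorting clip metadata on different keys. A rough sketch, with an illustrative data model that is my own and not Mistika's:

```python
from dataclasses import dataclass

@dataclass
class Clip:
    record_tc: int   # timeline position, in frames
    reel: str        # source reel / media file id
    src_tc: int      # source in-point, in frames

clips = [
    Clip(0,   "A001", 100),   # angle A
    Clip(48,  "B002", 300),   # angle B
    Clip(96,  "A001", 250),   # back to angle A
    Clip(144, "B002", 420),   # back to angle B
]

# "Sort by reel/media and TC": every instance of the same source
# file lands next to its siblings, ready for grade propagation.
by_media = sorted(clips, key=lambda c: (c.reel, c.src_tc))
assert [c.reel for c in by_media] == ["A001", "A001", "B002", "B002"]

# "Sort by timeline order" is just record TC:
by_timeline = sorted(clips, key=lambda c: c.record_tc)
assert by_timeline[0].record_tc == 0
```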
  9. Hi Cristobal, Thanks for checking it out. Correct, once I pressed really hard I could get results. I just didn't realize that this was the issue, which is why it took a while. One possibility would be a checkbox that determines whether/how pressure is factored in, which could be off by default. In Photoshop the default is that pressure determines brush size rather than opacity, but there are a few buttons at the top that let you select your desired behavior. A checkbox like that would let people opt into pressure control rather than being thrown off by it. But if you haven't heard anyone else complain, it may be a corner case and not a high priority at all.

     PS: My tablet is an Intuos Pro with the latest drivers. But I also use it as a touch pad instead of a mouse, so I only rely on the pen when painting or drawing shapes; for most interactions I just use finger touch and gestures. I had made my pen pressure firmer to avoid false positives when drawing shapes, which is what I use the pen for the most. Thanks as always, Jan
  10. Not sure what the correct answer is, but something to mull over... Yesterday I was doing some quick cleanup on a clip and ended up banging my head against the proverbial wall for 30 minutes. I couldn't get the clone brush (or the paint brush) to do anything I wanted. The strokes showed up when selected, but there were no results. I could add shapes without issue. I almost suspected a bug in the latest version and went back to tutorials to check that I wasn't missing anything.

     In the end it turned out I had recently changed the tip sensitivity of my pen to be a bit firmer. Between that and the standard pressure curve in the Mistika paint tool, the effective brush pressure was essentially near 0%, even though the brush opacity was maxed out at 100%. Not until I re-adjusted the pen sensitivity in the Wacom settings and then added a much more aggressive response curve in Mistika did it start doing what I expected.

     I wonder if there should be some sort of feedback (either a readout of the effective pressure opacity, or a warning if the opacity is below a certain threshold) to avoid other people's frustration, so it's more obvious why the brush isn't doing anything. It's also worth double checking that the default curve is appropriate across the full range of sensitivity settings of the Wacom tablet.
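For illustration, here's a toy model of how a firm pen setting and a gentle response curve can multiply out to an invisible stroke. The exponent-style curve is my own assumption for the sketch, not Mistika's actual response curve:

```python
def effective_opacity(pressure: float, brush_opacity: float,
                      gamma: float = 1.0) -> float:
    """Toy model: effective opacity = curve(pressure) * brush opacity.
    gamma > 1 demands firm pressure before anything shows up;
    gamma < 1 is a more aggressive curve that reaches full opacity
    sooner, like the adjustment described above."""
    pressure = max(0.0, min(1.0, pressure))
    return (pressure ** gamma) * brush_opacity

# A pen tuned firm may only report ~0.1 pressure on a normal stroke.
# With a gentle curve the stroke is nearly invisible even at 100%
# brush opacity -- which would explain the "nothing happens" symptom:
assert effective_opacity(0.1, 1.0, gamma=2.0) < 0.02
# A more aggressive curve restores a visible result at the same touch:
assert effective_opacity(0.1, 1.0, gamma=0.4) > 0.35
```

A readout of this effective value in the UI would have made the problem obvious immediately.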
  11. Conforms from an NLE are always a mixed bag. We wish they were painless and 100% accurate, but they rarely are. One interesting technique that exists in Resolve is called a pre-conformed EDL. You export the timeline from the NLE as a single render, and you export an EDL of the timeline. In Resolve the EDL is imported as a pre-conformed EDL (a separate function), and instead of linking to the original media, it uses the single render from the NLE and replicates all the cuts as they existed. Of course the relationship to the original media is lost, but everything is in individual clips with cross fades, etc., and can be colored.

     Such a function doesn't exist in Mistika. But in experimenting with it, I may have found a way of easily doing the same thing in Mistika. Import and load the timeline render as a single long clip. Then import the EDL with 'link to media' de-selected; it will create a set of unlinked clips in the timeline, mirroring the edit in the NLE. Now move that EDL import right above the long NLE render and run the 'split conform' macro, then delete the EDL import. The render now reflects the original timeline's cuts. It doesn't require scene detect, and it can bring over transitions that are EDL compliant.
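The core idea of a pre-conformed EDL can be sketched in a few lines (an illustrative frame-based model, not Resolve's or Mistika's implementation): because the render is the timeline, each event's record range doubles as its source range into the render file.

```python
def preconform(edl_events: list[tuple[int, int]]) -> list[dict]:
    """Given EDL events as (rec_in, rec_out) frame ranges on the NLE
    timeline, produce clips that cut a single timeline render at the
    same points. Record TC doubles as source TC into the render,
    since the render and the timeline are frame-for-frame identical."""
    return [{"src_in": rec_in, "src_out": rec_out,
             "rec_in": rec_in, "rec_out": rec_out}
            for rec_in, rec_out in edl_events]

# Three events on the timeline become three cuts into the render:
events = [(0, 48), (48, 120), (120, 200)]
clips = preconform(events)
assert clips[1]["src_in"] == 48 and clips[1]["src_out"] == 120
```

This is essentially what the 'split conform' workaround above reproduces: the unlinked EDL import supplies the cut points, and the macro slices the long render at those record TCs.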
  12. Yes, you do have to keep the eval tree open and select appropriately. It's all logical once you wrap your head around it, and quite powerful. Many other apps have similar concepts for selecting visual scope vs. edit scope: the Avid track panel, Silhouette's tree, etc. These are just the UI quirks of large applications that take some time to learn.
  13. At 7:30 in the video he explains the key point. In the shapes interface you need to select 'keep' with the input alpha so you can combine the green screen alpha and the shape.
  14. Not at my system right now. This old tutorial has good detail on it:
  15. 32GB of RAM?

    Hi Cristobal, Adding to that question - I do have 64GB in my main system and 32GB in my mobile system - I have the sense that the proper utilization of all this memory depends a lot on the settings in mistika config. I've been following the various descriptions in the Performance Options tab, but I'm not at all certain those are set properly. For example, Max Cache Memory usually defaults to 1GB. Would it make sense to increase this on a system with 64GB? Also, System CPU Cores and many of the other parameters further down default to 0. Based on the descriptions I've changed them to the number of cores, or 1/2 or 2/3 of the number of cores. Does 0 mean Mistika will automatically pick the best value, or is 0 the minimal value that will lead to non-optimal performance? If you have extra memory, is there anything we should do to optimize where it is applied based on the specifics of the project we're working on? Thanks, Jan
  16. Save Grad as 3D LUT

    Very interesting. A bit of a process indeed, and I'm not sure I would do it regularly, but I like the technique. It will come in handy some day.
  17. Save Grad as 3D LUT

    All good. Just wanted to make sure I wasn't missing anything.
  18. Save Grad as 3D LUT

    Hi @Yoav Raz, Thanks, interesting. I missed that when looking around. It's not completely the same, though. Export Primary LUT only exports a 1D LUT, not a 3D LUT, and Export Primary to CCC exports an ASC CDL statement in XML format. It does make it easier to get a CDL, though. Cheers, Jan
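For reference, an ASC CDL in CCC/XML form carries just ten numbers per clip (slope, offset, and power per channel, plus saturation), which is why it is only a subset of a grade compared to a 3D LUT. A sketch parsing a hand-written example (the values are made up for illustration):

```python
import xml.etree.ElementTree as ET

# Minimal ASC CDL collection: SOP (slope/offset/power) plus saturation.
CCC = """<ColorCorrectionCollection xmlns="urn:ASC:CDL:v1.01">
  <ColorCorrection id="clip_001">
    <SOPNode>
      <Slope>1.10 1.00 0.95</Slope>
      <Offset>0.02 0.00 -0.01</Offset>
      <Power>1.00 1.00 1.05</Power>
    </SOPNode>
    <SatNode><Saturation>0.90</Saturation></SatNode>
  </ColorCorrection>
</ColorCorrectionCollection>"""

ns = {"cdl": "urn:ASC:CDL:v1.01"}
root = ET.fromstring(CCC)
slope = [float(v) for v in root.find(".//cdl:Slope", ns).text.split()]
assert slope == [1.10, 1.00, 0.95]
```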
  19. Save Grad as 3D LUT

    Found a partial answer: when exporting an EDL2 there is an option to export a CDL, which would be a subset.
  20. Does Mistika have the option of saving a stack as a 3D LUT to be used as a look in other software? Workflow example: import footage, grade to a look, export the footage as an EXR clip to go into other software, export a 3D LUT of the look, load the EXR and LUT in the other software to do additional processing, save the changes as an EXR sequence, and import the EXR sequence back into Mistika.

     Use case: if more extensive skin work has to be done than is efficient with the Mistika paint node, I often take EXR sequences to SilhouetteFX. I was just working out the details of how to bring ACES-graded footage out of Resolve into Silhouette that way, but it relies on the ability to save a grade as a 3D LUT. Of course a LUT cannot reproduce all grade operations, particularly spatial and temporal changes, but it can encompass all primaries, bands, and fixed vectors, for example. I looked around and searched the manual and didn't find any reference hinting at the ability to do this in Mistika. This is more out of curiosity at the moment, not a high priority.
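For anyone following along, the 3D LUT container itself is simple: the common .cube text format is just a header plus one RGB triple per lattice point. Here is a sketch that writes an identity LUT; a real export would sample the grade at each lattice point instead of passing the values through (the writer below is my own illustration, not any app's exporter):

```python
def write_cube(path: str, size: int = 17,
               transform=lambda r, g, b: (r, g, b)) -> None:
    """Write a .cube 3D LUT. `transform` maps a normalized (r, g, b)
    lattice point to output RGB; the default is an identity LUT."""
    with open(path, "w") as f:
        f.write(f"LUT_3D_SIZE {size}\n")
        # .cube convention: red varies fastest, then green, then blue.
        for b in range(size):
            for g in range(size):
                for r in range(size):
                    x, y, z = (r / (size - 1), g / (size - 1), b / (size - 1))
                    ro, go, bo = transform(x, y, z)
                    f.write(f"{ro:.6f} {go:.6f} {bo:.6f}\n")

# A 2x2x2 identity LUT: 8 lattice points from black to white.
write_cube("identity.cube", size=2)
lines = open("identity.cube").read().splitlines()
assert lines[0] == "LUT_3D_SIZE 2"
assert lines[1] == "0.000000 0.000000 0.000000"
assert lines[-1] == "1.000000 1.000000 1.000000"
```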
  21. If you grouped your left/right clip so it's a single track, does that make a difference?
  22. To apply a transition, I usually lasso both clips (to the left and right) and click on the transition. It automatically places it on top, spanning the overlap and including everything vertically. My Sapphire license is not current for OFX, but if I apply an OFX transition, go to the first frame, turn on autokey, change a parameter, jump to the last frame, change the parameter again, and then scrub, it properly interpolates. I can't see the effect since Sapphire just renders a black frame due to the expired license. So I think it should work; there may just be something wrong in your sequence of steps?
  23. Thanks. Happy to see how quickly you respond and improve things.
  24. It was a client project and the color render is a 10GB file. But if you want I can file a ticket and provide a Dropbox link there. I trust your team will handle the footage with the appropriate discretion.