Everything posted by jan1
-
I just upgraded to the latest release of Mistika Workflow. I was keen to try out the H.264 transcoding, since my most common workflow outside of the main app is building delivery files from the master renders. Still issues. I took a ProRes input with my ProRes 444 master and a 'file' input for my audio WAV master, connected them to an H.264 output with default settings. Ran the queue; it errors out halfway with 'H.264: Media Write Error' and no further details. It created a file that is playable, but it contains only the first 30s of a 5 min video and ends abruptly at 30s.

The H.264 settings are very terse and difficult to read. For example, Bitrate just says 10. I guessed it might be 10 Mbps, and confirmed that the file was indeed encoded that way, but it would be good to label these better. Same with Codec 'H264BR' - what does BR stand for? Is this variable or constant bitrate encoding? Does it do 1-pass, 2-pass, or multi-pass encoding? Unlike ProRes or DNx codecs, H.264 has a lot of very important parameters that need to be set correctly. I don't think the current UI is usable for H.264 workflows.

Also, at least in my case, the 'Audiocodec' defaulted to 'None' even though I had something connected to 'AudioIn' (although that was connected as a second step from a different node). I think the validation rules should have warned that there was an audio input but the codec was set to None. Easy to overlook otherwise. Audio codecs are currently listed as Wav, Aiff, and two more. I don't see AAC listed, which is the most commonly used audio codec for digital content and the default in most NLEs.

The queue system is of course a great feature. But when working more interactively it would be nice to be able to run the current workflow right from 'Overview'. It could still go through the queue in the background, but forcing everything through the queue creates a few extra clicks.

When it comes to H.264 render UIs outside of the NLEs, Handbrake and Wondershare are two of the most common converters (as well as the now obsolete Squeeze). Those UIs could be looked at as a guideline for what should be configurable.
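For reference, here is roughly the set of knobs I mean, expressed as an ffmpeg call. This is just an illustrative sketch - the file names, bitrates and preset are made up, and it obviously says nothing about how Mistika Workflow is implemented internally:

```python
# Hypothetical web-delivery encode from a ProRes master plus a WAV master.
# All paths and values are illustrative, not taken from Mistika.
import subprocess

SRC_VIDEO = "master_prores444.mov"   # hypothetical ProRes 444 master
SRC_AUDIO = "master_audio.wav"       # hypothetical WAV master
OUT = "delivery_h264.mp4"

subprocess.run([
    "ffmpeg", "-y",
    "-i", SRC_VIDEO, "-i", SRC_AUDIO,
    "-map", "0:v:0", "-map", "1:a:0",     # video from the master, audio from the WAV
    "-c:v", "libx264",
    "-b:v", "10M",                        # target bitrate ('Bitrate 10' presumably means 10 Mbps)
    "-maxrate", "12M", "-bufsize", "20M", # VBV settings that decide how variable the bitrate may be
    "-preset", "slow",                    # speed vs. quality trade-off
    "-pix_fmt", "yuv420p",                # 4:2:0 for web players
    "-c:a", "aac", "-b:a", "320k",        # AAC audio, the default in most NLEs
    OUT,
], check=True)
```

A 2-pass encode would run this twice with '-pass 1' and '-pass 2'. Those are exactly the kinds of choices the current UI doesn't expose or label.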
-
There is a difference between luminosity bands and frequency bands. The band control and keyer described above let you focus in on specific luminosity bands, and there are good sharpen and blur effects that can be combined with keying, but they affect all detail levels at once. Frequency separation works in a different dimension and is often used specifically for beauty retouching. Think of it as the very fine texture of skin, like individual pores, vs. the medium texture of a pimple, vs. the large texture of a whole cheek. With frequency separation we often want to affect these separately, meaning we can even out larger blemishes without losing the finest texture of the skin, to prevent it from looking fake. The level of blur (or defocus) you need to affect the medium skin would totally destroy the small texture.

As the tutorial on the SGO website shows, the typical technique is to blur the image and then subtract the blurred version from the non-blurred version. That subtraction yields the high frequency (smallest texture). Adding the layers back together then restores the original image. With the layers separated you can mask or clone on them individually and only affect one range of texture. If you want three bands instead of two, the stack gets a bit more complicated.

There are no primary operations that can do this on their own. It either requires a full stack that performs this (like in the tutorial) or a dedicated filter that performs these operations internally. What I found is that the Enhance filter with its three bands may in fact offer a defocus/blur for different ranges, which was the exciting discovery.
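To make the blur/subtract/add idea concrete, here is a minimal sketch in OpenCV/NumPy rather than Mistika nodes. The file name and blur radius are illustrative only:

```python
# Two-band frequency separation: blur, subtract, retouch, add back.
import cv2
import numpy as np

img = cv2.imread("frame.png").astype(np.float32)

low  = cv2.GaussianBlur(img, (0, 0), sigmaX=8)  # low band: broad tones, larger blemishes
high = img - low                                # high band: pores and the finest texture

# ...retouch the low band (clone / smooth the blemish) without touching 'high'...

recon = low + high                              # adding the bands back restores the image
cv2.imwrite("recon.png", np.clip(recon, 0, 255).astype(np.uint8))
```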
-
Thanks Cristobal & Yoav, I'll have to experiment with those... So many hidden gems in Mistika.
-
Thanks, that put me on the right track. I ended up with a slightly different eval tree, since I dislike copying the original clip. I added a second vector paint for the recovery as you suggested, and used the Connect node to connect the original clip to the second paint node (it has to be the 1st input). Then I just used the 'reveal back' brush in those few frames to paint the original back in. That's pretty straightforward. Connect is a hidden gem to keep in mind. It can split any node into multiple paths. Jan
-
I'm using the Vector Paint/Clone tool to remove some skin blemishes. It works quite well, including the tracking, if there is a small number of blemishes, and can save the trip through VFX software. One question: I have one blemish where the person's hand moves in front of it for part of the clip (occlusion). So I used the clone brush, tracked the blemish and copied the track, which applies the clone to every frame. What's the best way of suppressing the clone brush for select frames? In my case the clip has 46 frames, and I want the clone to be visible in frames 9-46, but not in 0-8. I tried selectively removing these keyframes, without luck. It looks like the opacity can only be adjusted during the original clone operation, but not on a keyframe level? I could use a separately painted alpha channel to suppress the clone on the selected frames, but that seems like using a cannon to shoot a bird. Any better ways of doing this? Thanks, Jan
-
Thanks. I'm familiar with that tutorial, but it complicates the node tree by having to copy the original clip. You can work around that with a template group, but having a simpler effect to integrate into the node tree is what I was hoping to find. When I do need it, I often need it on most clips on the timeline, not just a single clip. The Resolve OFX effect I've been using with decent success has this user interface, which seems about the same as three stacked 'Enhance' filters - which is what I figured out yesterday. If you combine that with a good alpha mask or skin key exported from a color node, it should solve most of the problems we typically use FS for.
-
In the open beta, under the tasks, there is an NVIDIA H264 node. But it seems to be tied to systems with NVIDIA GPUs. On the current Mac Pro with AMD GPUs, that leaves Mistika without any H.264 encoding task? That codec is required for any content destined for the web.
-
I'm finally finding some time to experiment with the various built-in FX and filters. Reading the manual and experimenting, it seems there is a lot of power hidden in these that is not obvious to many new users. Sometimes the names are pretty generic, other times there are many complicated parameters. One thing I've always missed in Mistika was some form of frequency separation to even out skin. Resolve has a new Blur/Sharpen effect that is divided into three bands. Reading the manual and playing with the Enhance filter, it seems Mistika has that built-in too. You just have to stack multiple Enhance filters and set up the RadL/RadH accordingly. Is that a correct interpretation? (A sketch of what I mean is below.) Maybe we can add one more edition to the Masterclass series that focuses specifically on all the effects. The manual has some 'use examples', but going through all this more interactively with the knowledge of the Masterclass presenters would be awesome.
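To make the stacking idea concrete, here is a hypothetical three-band split using two blur radii (OpenCV/NumPy, values illustrative). This is only my interpretation of what stacked Enhance filters with different RadL/RadH would amount to, not SGO's actual implementation:

```python
# Hypothetical three-band decomposition with two blur radii.
import cv2
import numpy as np

img = cv2.imread("frame.png").astype(np.float32)

blur_fine   = cv2.GaussianBlur(img, (0, 0), sigmaX=2)    # removes only the finest detail
blur_coarse = cv2.GaussianBlur(img, (0, 0), sigmaX=12)   # removes medium detail as well

fine   = img - blur_fine            # pores, grain
medium = blur_fine - blur_coarse    # pimples, medium blemishes
coarse = blur_coarse                # overall tone and shape

# adjust each band separately, then sum to rebuild the frame
result = coarse + medium + fine
cv2.imwrite("rebuilt.png", np.clip(result, 0, 255).astype(np.uint8))
```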
-
Not sure if there's a pause. But you could try using a display filter, which reduces the resolution and complexity so it updates closer to real time. The display filter is easy to toggle on and off via the T1.. buttons.
-
NVM - Figured it out. In the mConfig settings somehow the RED debayer was defaulted to a reduce factor of 2....
-
I've tried to figure this out for the last 20 minutes and can't spot it. Maybe someone knows? I imported some RED Dragon footage (4608x2160) and set the project settings to that same resolution. The clip loads and plays fine, but for some reason it's coming into Mistika at half resolution. Also, on the display it plays in the center of the monitor with black framing equal to half the resolution. Here's what the clip attributes say, and here's the 'rescaled' message in the visual editor. Of course I can use a framing effect to adjust it, but I'm concerned that the clip somehow got loaded wrong and the framing effect will stretch a rescaled image instead of the native resolution. What am I missing?
-
Updated to latest Mocha Pro. Still doesn't work in embedded/FX mode.
-
I can give it a try tomorrow and report back.
-
Tangent Wave 2 Support and Tangent remap options
jan1 replied to agonzalez@sgo.es's topic in Releases
Hi Cristobal, Thanks for the explanation. I was thinking it may have a second one, like the 'B' button that goes to the next bank on the Elements. I see in your mapping library that there is an item to switch to another button bank, but with the Wave only one exists. I guess the Tangent Mapper only has one defined for the Wave. Is there any documentation from Tangent on how their mapping works? I would love to look through it if you have a link. My Elements has 6 units (3x Bt, Kb, Tk, Mf). Jan
-
Indeed. I think it finds its origins in After Effects and Avid, rather than the various OFX based tools. It seems that it's a lot more complex than your average plugin. There are supposedly some improvements coming in the new Resolve version, which may finally support it. How good it is remains to be seen.
-
Tangent Wave 2 Support and Tangent remap options
jan1 replied to agonzalez@sgo.es's topic in Releases
Two things to try: For the Wave2 the mConfig is slightly different. In the Tangent tab you need to check both the USB checkbox and the one next to it, not the Elements one. I ran into that at first. Also, on the AJA make sure that you allowed the kernel extension in settings. The best way to check is to run the AJA utility and see if it is happy with the hardware. That was an issue for me on my new laptop. Also make sure it's a TB3 cable. I first used a generic USB-C charging cable; it looks the same, but doesn't work the same. And after a version upgrade, recheck all your mConfig settings. Some of them get reset to defaults, which is cumbersome. So make sure the AJA I/O checkbox is still set, and that the Mistika UI is set to 'live'.
-
Tangent Wave 2 Support and Tangent remap options
jan1 replied to agonzalez@sgo.es's topic in Releases
Hi Cristobal, Indeed, it's a small panel and harder to map. And I'm happy to experiment with creating a custom mapping, especially now that the library is so much better organized. A few thoughts:

The 'Alt' button, which is mapped similarly to the 'A' on the Elements, is a bigger issue on the Wave. Since I'm using the panel with one hand, on the Elements the 'A' button and the function button are close enough that I can do it with one hand, and in all cases those are buttons, not knobs. On the Wave2 mapping, the 'Alt' and some of the buttons are further apart, requiring two hands, as is any knob, especially some I might use more frequently such as the CMY fixed vectors.

Not sure if you saw how Resolve mapped the Wave2 (not that I really want you to copy everything Resolve has done by any means... just as a matter of getting some ideas)? They have a larger set of banks (more than 10, I believe). With the up/down arrows you can flip between banks (which is tedious), but more importantly, if you push both up/down at the same time, they bring the panel into a mode where you use the button row for a quick select of the bank, and then have the whole bank for one feature. In the case of the keyer they have three banks for that. So you use the double up/down to select the keyer, and the single up/down to switch between the three keyer banks.

In terms of me being able to customize the mapping, is there a way for me to add another button bank or change the behavior of the Alt button from a shift to a toggle? I assume that involves some of the other .xml files. Do you know if there are examples or documentation? I briefly looked through them, but haven't fully decoded it yet. Thanks for all the hard work and for being patient. I seem to have tickets or comments for you so often. Don't want to overstay my welcome.
-
Tangent Wave 2 Support and Tangent remap options
jan1 replied to agonzalez@sgo.es's topic in Releases
Hi Adrian, I installed that on my mobile system with the Wave 2. The instructions were a bit different for the Mac, so I may have made a mistake. I ended up having to add the Wave2 in the panel configuration and then load the wave2.xml as the mapping. I like most of the mapping, but there seem to be some errors. For example, when you get to Windows, the different shapes are assigned to the top row, not the bottom row. The top row is just knobs, so that doesn't really work. Same thing for adding points in Curves. Unless somehow I loaded the wrong map? Love the up/down arrows for undo/redo. But it's awkward to have so many functions only reachable with 'Alt', like the CMY variant of the fixed vectors. I often have my panel next to the keyboard and operate it with one hand. Having to press the Alt key and then operate a knob is not ergonomic for me. The Alt should be a toggle, not a shift. But with the newly organized library it's so much easier to customize, so I will spend some time to see if I can fine-tune it for my habits. Thanks so much! Jan
-
There was a previous discussion about improving the license tool to avoid the hiccups during the monthly renewal and when using Mistika on laptops, which may change network settings. I'd like to add one more request to that list - remote floating licenses, or at least little sibling licenses.

The scenario: My main grading system is in my office. However, since my office is a tad far from many of my clients, I started putting together a second, laptop-based system which I can take to clients' offices for supervised sessions. Because I don't have network connectivity to my office when I'm at a client, under the current licensing setup I need to maintain two full licenses, even though I never use more than one at a time. That's an expensive solution.

Possible solutions:
1) Use a different floating license server that does not require a constant network connection
2) Allow each license to be installed on two systems (Adobe Creative Cloud includes two systems for similar reasons)
3) Offer a second license at a discount (a sibling license, etc.)
Jan
-
Copied the files. Looks a lot better and easier to find things! Thanks, Jan
-
Hi Cristobal, Thanks for pointing this out. I saw your other post to Rakesh and will definitely try it out. I may also check out the Wave2 mapping. I do have one for my mobile grading setup, but so far I have not run Mistika on it. Jan
-
As you know, I've started using Mistika BT extensively. I wasn't really aware of what the other tools (Review, VR, and Workflow) do, or that they're in beta and can be tested... I think there is an opportunity to give them much more exposure. I downloaded Review and played with it. A few thoughts from my first experiments:
- It's a helpful tool. The tool to beat is Telestream Switch, which many people use who need an external player for a variety of footage. The other tool to watch is frame.io, which is quickly becoming the standard for review & comments (in a collaborative environment) and is doing a lot of UI development and tool integrations.
- It's very cool that we can view so many formats and have the ability to control input color spaces, LUTs, etc. That is an advantage of Mistika RW that other tools don't have.
- But there are quite a few features missing for it to be as useful: I didn't see any ability to play back audio (did I do something wrong?). It absolutely should leverage existing Mistika I/O and play back on external displays via AJA (and BMD in the future). The metadata display should be more comprehensive (see the information available in Switch). It would be nice to have some scopes and other basic analytics.
- Features that I'd love as UI improvements: Consider how people add comments in frame.io. The text box is always open underneath the playback window; once you start typing it stops playback, and once you hit save it automatically generates a new comment, no need to click '+'. There also need to be ways to export comments in typical formats. See frame.io - the most basic export is a CSV that includes a TC with every comment; better integrations for Avid can export comments as markers with notes.
- I did load one EXR sequence and set the input space and curve to ACEScg (also tried linear). The results didn't look right.
Jan
-
OK, good to know that I wasn't using it wrong... Until recently BorisFX had a similar issue with MochaPro in Resolve. The OFX implementation didn't provide sufficient access to other frames I believe. But the new 2019 release of BorisFX is supposed to fix that. I believe Resolve finally implemented the required changes to provide access via the OFX API. I'm assuming the same may be at issue with Mistika? Which means there may be hope for improvements before too long.
-
Ah, well, that's always an option. But then the claim that Mocha Pro is fully supported may be a stretch. I've done the EXR export, work in Silhouette, render, and re-import routine already. That's a pretty typical workflow for me.
-
Would it be possible to get a short tutorial / example on how to use Mocha Pro? I tried a remove yesterday, and while Mocha Pro did its job, I couldn't get the results back into Mistika. I know it's supported, but I must be doing something wrong.

I first tried it on some 4K footage, to remove a small tattoo on a person's arm. It was only 15 frames. I added the footage and the Mocha effect, it loaded fine, I did the remove, closed Mocha and selected 'Render=yes' and 'Module=remove', and the UI becomes very unresponsive. Monitoring system activity, it's not clear if it's rendering in the background or not. Eventually I got some partial output on some frames.

Then I tried it with an HD project, added a LUT3D and the Mocha effect, and as soon as I turn 'Render=yes' the UI again becomes very sluggish. If I open Mocha, the first frame is good, but scrubbing the timeline inside Mocha only shows black frames, as if Mistika is too busy serving data to Mocha. I force quit Mistika and tried again: as soon as I change 'Render=yes', and before launching the Mocha UI, if I scrub the Mistika timeline the external display updates, but the Visual Editor picture remains frozen.

This is on a Mac / High Sierra on the latest 8.8.1 build of Mistika, and the latest version of Mocha Pro.