It is a little-known fact that Rotobot was trained in the colour space of the Internet: sRGB, with a gamma of roughly 1.8-2.2 with respect to light. This means that if your footage is not in sRGB, Rotobot will not detect subjects as well as it would on sRGB footage. So how do you convert?
Using an OCIO Colorspace Transform node, you can convert from an "in" colour space of linear to an "out" colour space of sRGB.
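The per-channel maths that such a linear-to-sRGB transform applies is the standard sRGB encoding curve. A minimal Python sketch of that curve (for illustration only; in practice you would let the OCIO node do this, and Rotobot itself simply expects sRGB input):

```python
# Standard sRGB transfer functions, applied per channel to 0-1 values.

def linear_to_srgb(c: float) -> float:
    """Encode a linear-light value (0-1) to sRGB."""
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * (c ** (1.0 / 2.4)) - 0.055

def srgb_to_linear(c: float) -> float:
    """Decode an sRGB value (0-1) back to linear light."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# 18% mid-grey in linear light encodes to roughly 0.46 in sRGB,
# which is why linear footage looks dark until it is converted.
print(round(linear_to_srgb(0.18), 3))
```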
Kognat has had feedback that Rotobot's processing times are long compared to the typical compositing or colour-grading process, which is far more interactive. To that end we have put up a video tutorial on processing footage with Rotobot in Natron, an open-source OpenFX host available for download from the internet. To run the software you will need to allocate trust to its developers, as it is not certified by the operating system. The demonstration is on macOS, but if needed I can repeat it on other operating systems. It makes use of the command-line interface called the "Terminal" on macOS, more generally known as the command prompt or the shell.
This is a simple process. In large visual effects facilities this batch process would be divided among many machines, where each machine processes one frame of footage, reports that the frame is complete, and asks for another frame or batch of frames to process.
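The farm dispatch pattern described above can be sketched as a shared frame queue with workers that each pull a frame, process it, report completion, and ask for the next one. This is a minimal sketch, not Rotobot's actual farm integration; `process_frame` is a hypothetical stand-in for rendering a single frame:

```python
# Sketch of a render-farm work queue: workers pull frame numbers
# until the queue is empty, mimicking machines asking for frames.
import queue
import threading

def process_frame(frame: int) -> str:
    # Hypothetical stand-in for a Rotobot render of one frame.
    return f"frame {frame:04d} complete"

def worker(frames, results, lock):
    while True:
        try:
            frame = frames.get_nowait()  # ask for the next frame
        except queue.Empty:
            return  # no frames left; this "machine" is done
        report = process_frame(frame)
        with lock:
            results.append(report)  # respond that the frame is complete

frames = queue.Queue()
for f in range(1, 25):  # a 24-frame shot
    frames.put(f)

results, lock = [], threading.Lock()
threads = [threading.Thread(target=worker, args=(frames, results, lock))
           for _ in range(4)]  # four render "machines"
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(results))  # every frame processed exactly once
```

In a real facility the queue would live in a render manager rather than in-process, but the division of labour is the same.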
Using a free, open-source OpenFX host means you can save on licence costs while running these long computational cycles.
Full house at Lot 14 on North Terrace in Adelaide. A great opportunity to collaborate with other AI researchers, developers and business people. Looking forward to hearing the presentations. Happy launch day, Kognat.
Gamurs AI is using computer vision and AI to analyse gameplay footage to improve the performance of esports teams.
Detecting subterranean vapour and liquid on Mars using ML on remote imaging and how it relates to wind patterns.
A great demo on containers and queues for non-HPC-based training by Adam from IBM.
Frontier microscopy using the "Marvin" robotic microscope and AI to detect asbestos in microscope samples of air filters and determine health risks. Well worth automating.
Please take time to read and share the following article about how the Rotobot OpenFX Plugin for Foundry Nuke (and others) is intended to be used in its current incarnation.
It is written from the point of view of a visual effects pipeline, where multiple people in an upstream department complete the work of creating rotoscoped footage, and a downstream department waits on that work, even to an intermediate standard, before it can begin its own. Rotobot can cut this wait time from days to minutes, using a few licences and ordinary CPU-based render-farm resources.
On Saturday the 28th of April, the alpha test of the OpenFX plugin called Rotobot Mask RCNN began. To get results like the video above in about 12 seconds per frame from Nuke or a similar host, without a green screen, contact us about trialling the plugin for 64-bit Linux. (Update: macOS also available. Second update: Windows also available.)