Upon reflection, I acknowledge that my progress has been delayed significantly this term due to my mental and physical health. I have been unable to settle on an idea that I want to work on, and falling behind in class has really shaken my confidence in my skills and vision. This has always been a struggle for me as a 3D artist, and being affected by social media doesn't help. Though seeing other artists' work can be a great source of inspiration, it inevitably left me lost and anxious about choosing a direction to move forward. This is why I have had many UE5 scenes made and then abandoned, yet they all helped me practise my UE5 knowledge simply by remaking new scenes over and over again until I made something that really fulfilled my creative vision. I am thankful for the patience and help I received from my classmates and tutors throughout this term <3


For this project, I want to go purely for an aesthetic, to help me conclude and practise what I have learned throughout this VFX Fundamentals unit. I have always loved the beauty of nature mixed with fantasy: it brings a familiar sensation but at the same time transcends into something out of the ordinary. Below are illustrations/concept art that convey the mood and aesthetic I have been looking for in my work. They tend to bring a sense of peacefulness, while acting as an invitation for us to step into that world and explore further.

Keywords: Ethereal, Flow, Dreamcore, Nature, Mythical,…

My creative process has always been rather intuitive and experimental, so I didn't exactly plan or block out my scene. I went back and forth a lot while creating it; my process is definitely not linear, but I will try to summarise the techniques I used below.

I was envisioning a pond at first, and tried out different water systems before deciding on the Water Body River, as I prefer its type of wave to the Ocean or Lake bodies. I already liked the river's original flow curve, so I went back and forth between sculpting the landscape and extending the river, morphing it into a yin-yang shape. I also didn't like that the landscape was square and wanted continuity between the river and the ocean body, so I used the Flatten brush to push all the surrounding land down and connect the river with the ocean. I also reduced the flow of the river water so the waves look calmer.

Before
After

For the landscape material, I first tried the technique of creating different material layers to paint with, but I didn't like the result very much, as I needed more variation on the surface for the landscape to look realistic. It also only worked up to three layers; any additional layer caused the paint to glitch and not show up properly. So I found a foliage add-on that has the type of grass I wanted to use, along with a landscape material that consists of many organic layers that look realistic when used together.

I was going for more of a dreamlike palette, so I changed the colours/tint of all the materials by going into the base colour textures and modifying the hues.

Changing the hues of each material
The material layers now follow the color palette that I wanted
Adding trees

I initially planned to model my own trees and foliage, but ended up using add-ons I found on the marketplace, as I wanted a realistic look and it is very time-consuming to start from scratch. However, all the models and foliage looked very basic at first, so I went in and modified all the textures to turn them into something of my own.

I spent a lot of time painting the foliage and changing its colours, mixing shades of white, green and pastel blue to further enhance the dreamlike look. I bumped the foliage density up really high for the most part, as I wanted the land to look like a cloud floating on water.

I did the same with the mushrooms here, but I also increased their sizes to make them look more surreal, like in Alice in Wonderland where everything is either too big or too small.

At this point I am very satisfied with the overall look of the scene. For the environment, I purposely increased the fog's density to make it more cinematic and create a god-ray effect through the trees. I also changed the cloud material, speed and hue to match the overall look.

I had previously modelled the balloon for this project; this time I created shaders for it to make two different colour variations. I wanted a fun fantasy look that feels dynamic, so I not only animated the balloons to rotate but also added some shakiness and tilt to them. I purposely chose glass and glossy materials with the intention of making them the main highlight of the scene. On the balloon itself is a shader that I also animated in the end to create a lava-lamp/lightning effect that turned out really cool.

Balloon 1
Balloon 2
My Nuke setup

In today's class, we learned about the stages of production for film compositing:
– Temps/Postviz – rough versions/mock-ups
– Trailer – key shots
– Finals
– Quality Control

Project Management Software:
– Google Doc/Spreadsheet/Notion
– Ftrack
– Shotgun

Production roles:

  • Line Producer: manages and keeps track of the whole team in terms of production.
  • VFX Producer: ensures studio projects are completed on time, striving to finish by the set deadline and within budget and available resources.

Tech Check before publishing a version of your work:

  • Check that you have addressed all the notes on the shot and followed the brief
  • Compare the new version with the previous version
  • Check editorial: are there any retimes in the shot?
  • Does your shot have the latest CG and FX?
  • Does your shot have the latest camera match move?
  • Write in personal notes
  • Do you have different alternatives for one shot?
  • Quality Control

  1. Roto Paint (P)

The RotoPaint node gives you a broader set of tools than Roto, though many of the controls are shared between the two nodes. It is used to clean up and clone out unwanted elements from the plate. Cons: heavier than the original Roto node.

Brush settings
Cloning the house using Rotopaint node
  • Clone: Ctrl + click = choose clone source. Shift + drag = change brush size
  • Reveal: Reveals the original background after cloning (change paint source from ‘fg’ to ‘bg’ to take effects).
  • Paint: Paint/draw on the shot
  • Blur, sharpen, smear, dodge, burn: Adjusts the area painted on the shot (similar to Photoshop).
  • Dodge for highlight, Burn for shadow
Lifetime type has many options; we typically use 'all frames', though you can use the other options for transitions.
Make sure the paint ends up in the alpha by changing the output mask to rgba.alpha before premultiplying.
Be mindful to change the source if RotoPaint doesn't seem to be working.

Two ways to separate the RotoPaint work from the plate (a sketch of the second approach follows):

Using a Difference node
Using Merge (divide), then Merge (multiply) back
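To make the second approach concrete, here is a minimal Nuke Python sketch of the divide/multiply separation, assuming a Read node named 'Plate' already exists in the script (the node name is a placeholder):

```python
import nuke

plate = nuke.toNode('Plate')                       # assumed Read node with the original plate
paint = nuke.nodes.RotoPaint(inputs=[plate])       # clean-up work happens here

# Divide the painted result by the original plate: untouched pixels become 1,
# leaving only the painted difference behind ('divide' = A/B).
separated = nuke.nodes.Merge2(inputs=[plate, paint], operation='divide')

# ...any extra treatment of the isolated paint could go here...

# Multiply back over the original plate to reassemble the full image.
recombined = nuke.nodes.Merge2(inputs=[plate, separated], operation='multiply')
```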

Grain

  • Film grain is the texture in photographic film, caused by small metallic silver particles developed from light-sensitive materials.
  • Unlike digital noise, which is electronic interference in digital cameras, film grain is a physical attribute of analog film.
  • Grain and noise both impart a comparable appearance, feel, and texture to filmed images. Techniques applied to footage from both digital and film sources, such as adding grain, adjusting perspective, and applying denoising, help to create a shot that feels more natural and organic.

Setup 1
  • When working on grainy footage, we typically start with a denoising pass. This step is essential to help track the elements we want to replace in the scene later on.
  • Once the editing is completed, grain is re-added to the footage to restore its original texture and maintain visual continuity throughout the shot.

Setup 2
Setup 3
  • Denoise node: link it to your source footage, then set a bounding box to analyse the existing noise in the footage. After this, you can fine-tune settings like the amount of denoise, smoothness, and other parameters.
  • Grain node: it can be quite difficult to match the grain to the source footage, as you'll need to go through the RGB channels one by one and adjust settings like grain size, irregularity, and intensity to achieve a close match.
  • F_ReGrain node: offers more precise grain matching than the Grain node. You connect the grain to the original footage and then link the Src (source) input to the shot you are trying to replicate. Note that this is available only in NukeX and is much heavier than the Grain node.

DasGrain node

  • normalised grain – an average of the denoised plate, the clean plate and the cleaned-up plate
  • common_key = looks for the difference between the clean plate and the cleaned-up plate (see the sketch below)
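DasGrain itself is a gizmo, but the core extract-and-re-apply idea behind it can be sketched manually with plain Merge nodes. This rough Nuke Python sketch assumes nodes named 'Plate', 'Denoised' and 'Cleanup' already exist (names are placeholders), and it skips the normalisation step that DasGrain adds on top:

```python
import nuke

plate    = nuke.toNode('Plate')      # original grainy plate (assumed node name)
denoised = nuke.toNode('Denoised')   # denoised copy of the plate (assumed)
cleanup  = nuke.toNode('Cleanup')    # denoised plate with paint/clean-up applied (assumed)

# Extract the grain: original minus denoised ('from' = B - A).
grain = nuke.nodes.Merge2(inputs=[plate, denoised], operation='from')

# Re-apply the extracted grain on top of the cleaned-up plate ('plus' = A + B).
regrained = nuke.nodes.Merge2(inputs=[cleanup, grain], operation='plus')
```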

Homework

Before cloning procedure
After cloning

I tried to match the grain using the Grain node, going through the RGB channels one by one. I wouldn't say it looks perfect, especially in the blue channel, but the result seems pretty good in the end.

R Channel
G Channel
B Channel

Concatenation is the ability to perform one single mathematical calculation across several tools in the Transform family. This single calculation (or filter) allows us to retain as much detail as possible.

– Do not put a Grade/ColorCorrect between Transform nodes, as it breaks concatenation and decreases the quality of the footage (it becomes more blurry).
– The only filter that matters in a Transform chain is the filter on the last node in the chain.
– The only motion blur that matters in a Transform chain is the motion blur on the last node in the chain.
– Use clone (Alt+K) if you want to adjust multiple nodes at the same time with the same values → saves time and avoids having to change nodes individually.
– Most of the time you'll use cubic as the filter. (See the sketch below.)
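A small Nuke Python sketch of the concatenation rule, assuming a source node named 'Plate' (placeholder). The two Transforms concatenate into a single filter hit; the Grade goes after the chain rather than between the Transforms:

```python
import nuke

plate = nuke.toNode('Plate')                        # assumed source node

# These two Transforms concatenate: Nuke combines them into one calculation,
# so the image is only filtered/resampled once.
t1 = nuke.nodes.Transform(inputs=[plate], rotate=15)
t2 = nuke.nodes.Transform(inputs=[t1], scale=1.2)
t2['filter'].setValue('cubic')                      # only the last filter in the chain matters

# Colour work belongs after the transform chain; dropping it between t1 and t2
# would break concatenation and soften the image.
grade = nuke.nodes.Grade(inputs=[t2])
```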

Correct Merging procedure

Bounding box (BBOX) = the area Nuke will read and render information from
Plate > S = project settings

The bounding box is represented by the dotted frame.

3 ways to match the bounding box (see the sketch below):

  1. Use a Crop node after the Transform node if your bounding box exceeds the root resolution, so Nuke doesn't read information outside of the format.
  2. Use a Merge node → change 'set bbox' from union (default) to B.
  3. If using a Roto node → Copy alpha → alpha → Premult; this will automatically match the bbox to the roto.
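A quick Nuke Python sketch of the first two options, assuming nodes named 'Element' and 'Plate' (placeholders):

```python
import nuke

fg = nuke.toNode('Element')   # assumed foreground element
bg = nuke.toNode('Plate')     # assumed background plate

# Option 1: Crop after a Transform so the bbox can't exceed the root format.
move = nuke.nodes.Transform(inputs=[fg])
move['translate'].setValue([200, 0])
crop = nuke.nodes.Crop(inputs=[move])
crop['box'].setValue([0, 0, nuke.root().width(), nuke.root().height()])

# Option 2: let the Merge clamp the bbox to its B input instead.
comp = nuke.nodes.Merge2(inputs=[bg, move])
comp['bbox'].setValue('B')
```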

Working with 3D

  • Press Tab in the viewer to switch between the viewport and the camera view
  • Hold Ctrl + left click to orbit
  • Use a Reformat node to limit the bbox to the size of the original 3D elements in the scene

Shuffle node

The Shuffle node maps input layers to output layers.
Shuffle creates new channels for the output and can read different layers from many inputs: internal ones like the layers we created, but also multi-channel EXR renders, where there are multiple LAYERS and multiple CHANNELS per image.
→ Useful for manipulating the rgba channels and depth (see the sketch below).
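For example, pulling a layer out of a multi-channel EXR into rgba can be scripted like this; a sketch using the classic Shuffle node (the knob layout differs in the newer Shuffle2), assuming a Read node named 'RenderEXR' with a depth layer:

```python
import nuke

render = nuke.toNode('RenderEXR')    # assumed multi-channel EXR Read node

# Read from the 'depth' layer and copy its first channel (depth.Z) into rgb,
# with a solid alpha, so the depth can be viewed and manipulated like an image.
shuffle = nuke.nodes.Shuffle(inputs=[render])
shuffle['in'].setValue('depth')      # input layer to read from
shuffle['red'].setValue('red')       # first channel of the chosen layer
shuffle['green'].setValue('red')
shuffle['blue'].setValue('red')
shuffle['alpha'].setValue('white')
```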

PLANAR TRACKER (perspective shift / 2.5 tracking)

PlanarTracker is good for perspective shifts, but not for spatial movement.

If there's an object in front: in the top viewer dropdown → add a new track layer, roto the object, and remember to put its folder ON TOP of the roto folder of the object behind. This tells Nuke to exclude the foreground object from the track.
– absolute: shrink the source image
– relative: shrink the source image less

ZDefocus

The ZDefocus node uses a focal point to generate bokeh, creating more of a film look than the Blur node.
Cons:
– The result can look a bit fake to the eye
– NukeX 14 has a Bokeh node, which uses the camera's information, so it's more accurate to the eye

Using Focal_point to pick subject/area to focus on
Convolve > Switch > Any shape (roto)/text can be used as bokeh shapes/blades

Homework

For this week's assignment, I needed to use the PlanarTracker node to replace the two posters in a scene with a perspective shift. At first I didn't understand why the CornerPin and the posters' corners were not matching; then I learned that I have to click on the Premult node first before adding the CornerPin, so that it is generated from the poster itself. Then I rotoed the pole and added it back on top of the poster.

As a final touch, I attempted to replace the wall with a brick-wall texture using what I had learned about planar tracking. I tracked the frames/windows around the poster, then tracked the wall using the same technique as the poster replacement. The result turned out quite nice, as the colour matches the whole scene, but I think the image texture is still too sharp and not perfectly blended with the wall, so it looks a bit off.

A major problem I had was not really understanding how to stabilise the brick-wall texture so it doesn't move with the camera shift. I tried stabilising from the Tracker, and also followed my teacher's advice to track a smaller area of the wall instead of the whole wall as I did at first, yet it's still not working at all. I might have to revisit this project to figure this out.

Node setup

Creating a Cine Camera in Unreal Engine

Unreal Engine offers three methods to create a Cine Camera (an editor-scripting sketch follows the list):

  1. Right-click in the viewport > Place Actor > Cine Camera Actor
  2. Use the ‘Quickly add to the project’ button next to the Mode option, select Cinematic > Cine Camera Actor
  3. Click on the sandwich menu, choose Create Camera Here > Cine Camera Actor
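A Cine Camera can also be spawned through the editor's Python API. A minimal sketch (the location and label are arbitrary; EditorLevelLibrary still works in UE5 even though newer subsystems exist):

```python
import unreal

# Spawn a Cine Camera Actor in the current level at an arbitrary location.
location = unreal.Vector(0.0, 0.0, 150.0)
rotation = unreal.Rotator(0.0, 0.0, 0.0)
camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, location, rotation
)
camera.set_actor_label("CineCamera_Hero")   # friendly name in the Outliner
```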

Camera Types

  • Camera Rig Rail: This mimics a real-world rail system, enabling the attachment and animation of the camera along a predefined path.
  • Camera Rig Crane: Similar to a crane in real life, this allows for the attachment and animation of the camera with crane-like movements.
  • Cine Camera Actor: This camera type provides detailed options for Filmback, Lens, and Focus, aligning with industry standards to create realistic scenes.

Features of Cine Camera Actor

  1. Piloting: Navigate the scene effortlessly by switching the view to a specific camera. Change perspectives by selecting ‘Perspective’ in the viewport, choosing the camera, or right-clicking in the viewport and opting for ‘Pilot’ followed by the camera’s name.
  2. Picture-in-Picture Display: The ‘Preview Selected Cameras’ option in Editor Preferences can be toggled on or off, allowing you to preview a camera by selecting it in the Outliner. This feature is enabled by default.
  3. Look at: Focus the camera on a specific object. Set this by adding an actor for the camera to track, then in Lookat tracking settings, select ‘Actor to track’, pick your desired actor, and turn on “Enable Look at Tracking.”

To change the viewport layout, click on the Sandwich bar, go to Layouts, and choose your preferred layout division. Maximize the current view with F11.

Incorporating Cine Cameras with Post Process Volume is vital for configuring Depth of Field (DOF) and Exposure. These settings are accessible in both Cine Cameras and PPV, with PPV offering global adjustments.

To view DOF in the viewport, navigate to Show > Visualize > Depth of Field Layers.

Post Process Volume (PPV) – Exposure

Local Exposure: Useful for consistent imagery when detailed scene lighting is impractical. Always set this up with Lumen Global Illumination.

Camera Setup Workflow (see the sketch after the list):

  1. Define the Filmback (sensor size).
  2. Adjust Depth of Field (DOF): set aperture, focal length, focus distance. (Example: narrow DOF used)
  3. Adjust exposure: set shutter speed, ISO.
  4. Create and adjust the exposure in the Post Process Volume (PPV).
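A rough editor-scripting sketch of step 2, assuming a CineCameraActor is currently selected in the level; the property names follow the Python API's naming of the CineCameraComponent and may need checking against your engine version:

```python
import unreal

# Assumes the first selected actor is a CineCameraActor.
camera = unreal.EditorLevelLibrary.get_selected_level_actors()[0]
cine = camera.get_cine_camera_component()

# Depth of field: focal length, aperture and a manual focus distance.
cine.set_editor_property("current_focal_length", 50.0)    # mm
cine.set_editor_property("current_aperture", 1.8)         # wide aperture = narrow DOF

focus = cine.get_editor_property("focus_settings")
focus.set_editor_property("manual_focus_distance", 350.0) # distance to subject (cm)
cine.set_editor_property("focus_settings", focus)
```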

Important project settings for Lumen
Point light
Spot light
Rect light

Env. Light Mixer – create lighting from scratch (a scripting sketch follows the list):

  • Create:
    • Sky Light
    • Atmospheric Light
    • Sky Atmosphere
    • Volumetric Cloud
    • Height Fog
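The same set of actors the Env. Light Mixer creates can also be placed with a short editor script; a hedged sketch (class names follow the standard engine actors, spawn locations are arbitrary):

```python
import unreal

def spawn(actor_class, label):
    """Spawn one actor at the origin and give it a readable Outliner label."""
    actor = unreal.EditorLevelLibrary.spawn_actor_from_class(
        actor_class, unreal.Vector(0.0, 0.0, 0.0)
    )
    actor.set_actor_label(label)
    return actor

spawn(unreal.SkyLight, "SkyLight")
spawn(unreal.DirectionalLight, "AtmosphericLight")   # the mixer's 'Atmospheric Light'
spawn(unreal.SkyAtmosphere, "SkyAtmosphere")
spawn(unreal.VolumetricCloud, "VolumetricCloud")
spawn(unreal.ExponentialHeightFog, "HeightFog")
```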

Other elements to add to create realistic sky/world/lighting:

  • Volumetric Cloud: Uses a material-driven method to create lifelike clouds, offering versatility in cloud types and enhancing the sky’s realism.
  • Exponential Height Fog: Adds atmospheric fog that varies in density with altitude, providing a smooth transition and allowing for two different fog colors for environmental tuning.
  • HDRI Map: Uses an environmental texture to provide accurate background scenery, natural reflections, and contributes to the overall illumination of the scene.

Things to keep in mind when dealing with indirect lighting:

  • Base/albedo colour: the material or colour of your objects matters, as light bounces off them. If the lighting in your scene seems too dark or too bright, consider tweaking the colour of your materials.
  • In the real world nothing is 100% black or white. Darkest black: 0.04, brightest white: 0.9, middle grey: 0.18
  • Use a chrome ball to visualise lighting & reflections

To turn off Auto Exposure (see the sketch after the list):

  • Add a PostProcessVolume to the scene
  • Infinite Extent (Unbound) ✅
  • Metering Mode: Manual
  • Apply Physical Camera Exposure ✅
  • Exposure Compensation: USE THIS to control the exposure (without having to manipulate the lights in your scene)
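The same checklist expressed as an editor-scripting sketch; the override/property names follow the Python naming of the post-process settings and should be treated as assumptions to verify in your engine version:

```python
import unreal

# Spawn an unbound Post Process Volume and force manual exposure.
ppv = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.PostProcessVolume, unreal.Vector(0.0, 0.0, 0.0)
)
ppv.set_editor_property("unbound", True)   # Infinite Extent (Unbound)

settings = ppv.get_editor_property("settings")
settings.set_editor_property("override_auto_exposure_method", True)
settings.set_editor_property("auto_exposure_method", unreal.AutoExposureMethod.AEM_MANUAL)
settings.set_editor_property("override_auto_exposure_apply_physical_camera_exposure", True)
settings.set_editor_property("auto_exposure_apply_physical_camera_exposure", True)
settings.set_editor_property("override_auto_exposure_bias", True)
settings.set_editor_property("auto_exposure_bias", 1.0)   # Exposure Compensation
ppv.set_editor_property("settings", settings)
```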

Experimentation

After several attempts at creating beautiful skies and environment lighting in UE5, I started to get the hang of working with Lumen. There are still a lot of settings that confuse me sometimes, yet I believe lighting is an essential aspect in deciding the quality and mood of a scene.

References

The Transform node deals with translation, rotation and scale, as well as tracking, warping and motion blur. Sometimes you want to animate these values using just one Transform node, but sometimes it's better to use a separate rotation or scale node to understand the process better.

Using separate rotation and scale nodes for individual operations

2D tracker

Tracker node: tracks a container of pixels in x and y.
– Allows you to extract animation data from the position, rotation, and size of an image.
– Using expressions, you can apply the data directly to transform and match-move another element.
– To stabilise the image, you can invert the values of the data and apply them to the original element.
– We can also generate several transform nodes from the main Tracker node to automatically stabilise the scene, match the movement, and either reduce or add shakiness.

General process for tracking an image (a stabilisation sketch follows the list):

1. Connect a Tracker node to the image you want to track.
2. Use auto-tracking for simple tracks or place tracking anchors on features at keyframes in the image.
3. Calculate the tracking data.
4. Choose the tracking operation you want to perform: stabilize, match-move, etc.
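A tiny Nuke Python sketch of the 'invert the tracking data' idea behind stabilisation: given a tracked feature's position per frame, key a Transform with the opposite offset so the feature stays put. The tracked values here are made-up sample data:

```python
import nuke

# Hypothetical tracked positions of one feature: frame -> (x, y).
tracked = {1001: (960.0, 540.0), 1002: (963.5, 538.2), 1003: (967.1, 536.9)}
ref_frame = 1001
ref_x, ref_y = tracked[ref_frame]

stabilise = nuke.nodes.Transform()
translate = stabilise['translate']
translate.setAnimated(0)
translate.setAnimated(1)

# Offset each frame by the opposite of the feature's drift from the reference frame.
for frame, (x, y) in tracked.items():
    translate.setValueAt(ref_x - x, frame, 0)   # x offset
    translate.setValueAt(ref_y - y, frame, 1)   # y offset
```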

2D, 2.5 & 3D tracking


– 2D track: x & y
– 2.5D: still x & y, but with 4 points to mimic a sense of perspective. Use the PlanarTracker for this.
– 3D: XYZ


– inner tracking square: tracks the main shape/point
– outer tracking square: the search area that looks for movement around the inner square in order to track it

Pre-tracking treatment:

Sometimes we should treat the original plate to obtain better tracks if the scene is too noisy or grainy. In this case, we use a Denoise node to reduce the noise or grain, helping the tracker read the changes between frames better. We can also use tools like Laplacian, Median, or a contrast Grade to work around the grain issue.

  1. Denoise the plate (Denoise node – Median node).
  2. Increase contrast with a Grade node.
  3. A Laplacian node can help in certain cases to lock better tracks.
Denoise footage to improve tracking
Stabilise operation and compensation using Transform nodes

It’s always important to use a Quality Control (QC) backdrop to make sure the tracking and any added rotoscoping is done right.

Homework assignment:

I attempted this assignment twice, as the first time I was really confused by the process and messed up the nodes. Doing it a second time made me realise that I first need to track the four points on the phone to create a Transform_stabilize node. This comes before the first Merge operation, followed by a Transform_matchmove generated from the same tracking data. Doing this ensures that the phone mockup is merged correctly with the tracked points.

I was not satisfied with the roto of the fingers at first because of the green spill. We also haven't covered green-spill removal in this particular node setup, yet I managed to compensate for it by using a FilterErode node with a slight blur on the edges so the roto isn't too obvious.

I also used an Erode node on the phone mockup, as it didn't fully cover the green screen at the top of the phone no matter how accurately I tried to adjust the CornerPin2D.

Before
After
Final node set up


Homework feedback:
– My final work is good, but personally I was not satisfied with the roto of the finger as it is still wobbly; I want to learn how to roto better in the future.
– I learned that I could have used the Curve Editor to smooth out my animation, controlling it from linear to a smoother flow
– x → f → press h on in & out points (for easy in & out)
– y → f → press h on in & out points to smooth animation or move curve

This week I tackled landscapes and materials in Unreal Engine 5. In terms of 3D I have always focused more on character design, yet creating detailed real-time landscapes is something I have always wanted to implement in my work, as I believe it can be a great storytelling enhancement.

Resources:
Unreal Engine 5 Beginner Tutorial for Film: Landscape and Materials
How I Quickly Create 3D Environments in Unreal Engine 5 | FULL WORKFLOW
Landscape Basics Tutorial for Beginners in Unreal Engine 5.2

Creating a landscape
1. Establishing scale: make sure your landscape is scaled correctly by adding a mannequin
– Content browser → Add → Add feature or content pack → Third Person
– Mannequin → Character → Mesh → SK_Mannequin

2. Landscape mode: here you can create a landscape by manually painting/sculpting on the plane OR by using height maps
– Settings: Sections per component = subdivides each square/section for a higher-resolution landscape
– Drag the downloaded surface material into the Landscape material slot

** Be aware of tiling: you can compensate for it by modifying tiling X & Y in the shader editor

Before
After


After trying out different surface textures from Quixel Bridge, the landscape still didn't look quite realistic to me when I used just one texture for the whole landscape. I eventually learned how to create different material layers so that I could paint onto my own landscape, using these tutorials.

From 2:10:00

How to create different landscape material layers (a scripting sketch follows the tips below):

  • In the Content Browser, create a material folder. Right click > New Material, and name it Landscape
  • Open up the material and add a LandscapeLayerBlend node (this tells UE5 that this is a landscape material)
  • On the left panel, click + to add new landscape layers. Name them accordingly
  • Import the textures into UE5
  • Add a MakeMaterialAttributes node, and connect Base Color (RGB), Normal (RGB) and Roughness (G) to it
  • Add TextureCoordinate, Multiply and ScalarParameter (for tiling) nodes, and connect them to the textures' UVs
  • Right click on the material > Create Material Instance, and drag this into the Landscape material slot. If you open it up now you can modify the tiling parameter added beforehand
  • Repeat accordingly to create more material layers
Tip: Hold alt + click on lines to break connection
Ctrl + W = Duplicate a node
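Most of this is naturally done in the Material Editor UI, but the skeleton can also be scripted. A hedged sketch using the editor's Python API (the asset name and path are placeholders, and the individual layers are still easiest to add and name in the LandscapeLayerBlend node's Details panel afterwards):

```python
import unreal

# Create an empty material asset to act as the landscape material.
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
material = asset_tools.create_asset(
    "M_Landscape", "/Game/Materials", unreal.Material, unreal.MaterialFactoryNew()
)

mel = unreal.MaterialEditingLibrary

# Add the LandscapeLayerBlend node and wire it to Base Color.
layer_blend = mel.create_material_expression(
    material, unreal.MaterialExpressionLandscapeLayerBlend, -400, 0
)
mel.connect_material_property(layer_blend, "", unreal.MaterialProperty.MP_BASE_COLOR)

# A scalar parameter for tiling, intended to be multiplied into the texture UVs.
tiling = mel.create_material_expression(
    material, unreal.MaterialExpressionScalarParameter, -900, 300
)
tiling.set_editor_property("parameter_name", "Tile")

mel.recompile_material(material)
```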

Example of how to create and organize material layers.
I tried to mock up my scene in Blender; it obviously did not go as well as in UE5, since I used the hair particle system to create the grass and it was VERY heavy
Experimenting with painting and sculpting the landscape in UE5


Premult = multiplies the input’s rgb channels by its alpha

Keymix = similar to Merge but accepts unpremultiplied assets. Often used for merging masks

Uses of the Premult node:
– Merging unpremultiplied images = to avoid unwanted artefacts (fringing around masked objects)
– Colour correcting premultiplied images (Unpremult → colour correction → Premult)

Unpremult = divides the input’s rgb channels by its alpha

Colour correcting premultiplied images:

When you colour correct a premultiplied image, you should first connect an Unpremult node to the image to turn the image into an unpremultiplied one.
Then, perform the colour correction. Finally, add a Premult node to return the image to its original premultiplied state for Merge operations. Typically, most 3D rendered images are premultiplied.
** If the background is black or even just very dark, the image may be premultiplied.
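A minimal Nuke Python sketch of that sandwich, assuming a premultiplied render in a node named 'CG_Render' (placeholder):

```python
import nuke

cg = nuke.toNode('CG_Render')   # assumed premultiplied 3D render

# Divide out the alpha, grade the straight colour, then multiply the alpha back in,
# so edge pixels aren't darkened or fringed by the colour correction.
unpremult = nuke.nodes.Unpremult(inputs=[cg])
grade = nuke.nodes.Grade(inputs=[unpremult])
grade['multiply'].setValue(1.2)  # example correction value
premult = nuke.nodes.Premult(inputs=[grade])
```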

Merging Operation Examples:

Merge (over)
Merge (mask)
Merge (average)
Merge (overlay)

Reformat = lets you resize and reposition your image sequences to a different format (width and height).
– Allows you to use plates of varying image resolution on a single script without running into issues when combining them.
– All scripts should include Reformat nodes after each Read node to specify the output resolution of the images in the script.
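A small Nuke Python sketch of that habit, with a placeholder file path and the HD_1080 preset as the assumed output format:

```python
import nuke

read = nuke.nodes.Read(file='path/to/plate.####.exr')   # placeholder file path

# Conform the plate to the script's output resolution right after the Read.
reformat = nuke.nodes.Reformat(inputs=[read])
reformat['type'].setValue('to format')
reformat['format'].setValue('HD_1080')                  # 1920x1080 preset
```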

Colorspace and Linearization
– Colorspace defines how the footage was captured or exported. Most files are non-linear, and knowing the correct colorspace is critical for proper linearization.
– Linearization is the process of converting footage into linear space. All the tools inside Nuke are built around linear math. This also allows the mixing of media types. We need to know what the colorspace of the file was before starting to work on it.
– You can work in LOG/RAW or Linear.

LUTs, CDLs, and Grades
· LUTs can be used creatively or technically, e.g. converting from log to lin, or adding a “look”
· CDLs are creative, e.g. adding a “look” to a clip
· Graded footage means footage coloured to its “final” look

For color correction we always want to think in terms of:
– Highlights
– Midtones
– Shadows
Two commonly used nodes are Grade & ColorCorrect.

Both give us the chance to grade the highlights, midtones and shadows of a shot.

– To grade highlights we use either GAIN or MULTIPLY
– To grade shadows we use LIFT
– To grade midtones we use GAMMA

How to match colours using a Grade node, matching constant A to constant B (see the sketch below):
– Add a Grade node (G), pick constant A's colour as the WHITEPOINT by selecting the eyedropper → Ctrl + Shift + click on constant A's colour.
– Pick constant B's colour as the GAIN by selecting the eyedropper → Ctrl + Shift + click on constant B.
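The same match expressed in Nuke Python, with made-up RGB values standing in for the colours you would normally pick with the eyedropper:

```python
import nuke

grade = nuke.nodes.Grade(inputs=[nuke.toNode('ConstantA')])  # assumed source node

# Illustrative sample values (in practice, picked with ctrl+shift+click).
colour_a = [0.18, 0.30, 0.45]   # sampled from constant A
colour_b = [0.60, 0.42, 0.25]   # sampled from constant B

grade['whitepoint'].setValue(colour_a)   # what the source colour currently is
grade['white'].setValue(colour_b)        # 'white' is the gain knob: what it should become
```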

2 examples of basic grade matches using merge node
Match color

Note: Cloning a node will keep the same value/setting across the nodes (signified by a C letter on top of the node)

Primary color

Secondary color

QC = quality control