Behind the Scenes of Hate Being Happy 4U (Spoilers)
- Justin Janoson

- Jul 31
- 9 min read
BACKGROUND
In late April, I made the decision to invest in a Sony FX30 to replace the a6400 I had been using since high school. While the a6400 was incredibly capable, its rolling shutter made handheld shooting a challenge. Additionally, the lack of dynamic range compared to more modern cameras was becoming increasingly apparent, leaving me struggling in post-production when trying to manipulate the image.
To test the limits of this new camera, I reached out to a few individuals to see if they would be willing to assist me in putting it through its paces. Although I couldn’t offer any compensation, I did provide lunch, reimbursed travel expenses, and promised to give them the content for their reels.
The incredibly talented Maya Borhaug accepted my offer and mentioned that she had a friend, Baron Leung, who would also be interested in acting to generate more content for their reel.
Initially, I had planned to write a scene from scratch. However, I had actually met Maya when she auditioned two years ago for an early iteration of Hate Being Happy 4U, which was supposed to be a short film. Unfortunately, due to financial constraints and family/personal responsibilities, production was put on hold.
Seeing this as an opportunity, I decided to explore how well the fantasy and reality concept would play out before any actual shooting took place. I wanted to provide something unique and special for her reel.
PRODUCTION ON THE REALITY SEQUENCES
The reality scenes were shot using a Helios 44M 58mm F/2 lens with an M42 to EF mount adapter, then a Viltrox EF to E speed booster on my FX30. My original plan was to actually shoot these sequences with the Sigma 30mm f/1.4 DC DN Contemporary Lens (Sony E), but at the last minute, I switched to the Helios 44M 58mm F/2 lens, and I honestly think it came out better for it.
Despite knowing the shot would not be moving, I made the decision to mount the tripod on wheels. Why?
One of my first paid jobs in film was as an assistant editor for a filmmaker named Anthony Costa on his feature film. He told me about how he often mounted the camera on a dolly even if he wasn’t using it for motion, as it allowed for a shorter turnaround time between setups.
Even though our camera package was fairly light, my ultimate goal is to make our sets as inclusive as possible. Mounting the camera on wheels lets people who have mobility impairments, use wheelchairs, or fatigue easily more readily move, position, and operate the camera. This reduces physical strain and ensures that individuals with disabilities can actively participate in camera operation, fostering a more accessible and equitable production environment.
Audio for the entire production was captured using our Movo WMX-20-Duo wireless audio system, with the lavaliers being taped under the talent's shirt using bandages.
Keeping this sequence shot on sticks also meant that the reality sequences had a distinct visual language from her fantasy.
The Helios lens and speed booster also tend to get less sharp at their edges, so I tried to keep all the action in the center of the frame.
SHARON’S FANTASY WORLD PRODUCTION
I wanted the sequence of Sharon confessing her feelings to be an out-of-body experience. Initially, I had written this sequence with Sharon and Josh floating towards each other using a Spike Lee-style double dolly shot. However, we didn’t have a dolly, so we came up with an alternative solution. Since I live near a shopping center, we “liberated” a shopping cart from the parking lot and walked it across the street.
To keep the tripod an equal distance from the cart, we rigged a Swiffer and a boom pole as a spacer (I forgot to get photos, but it looks as goofy as it sounds). Here is our initial test.
It didn’t really work, but my plan was to switch from the 28mm Continental Optics f/2.8 M42 lens to my Sony E 18-135mm f/3.5-5.6 OSS and mount it on the gimbal to reduce some of the bumps from the parking lot. I wouldn’t get the bokeh and unique look I wanted, but at least I would most likely get a more usable shot.
But someone stole our “liberated” shopping cart while we were out to lunch, so we pivoted to just using the gimbal. I had tested the Helios lens with a wireless follow focus while mounted on the gimbal, and it was okay, but I didn’t feel comfortable enough to use it for these shots since I knew I wanted one long take. So I ended up using the Sigma 30mm f/1.4 DC DN Contemporary Lens (Sony E) with a variable ND filter in front so we could shoot wide open.
The sequence was shot at 59.94 fps with a shutter speed of 1/120 because I subconsciously wanted it to feel different from reality.
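As a quick aside, 59.94 fps at 1/120 is not an arbitrary pairing: it lands almost exactly on a 180-degree shutter angle, so each frame keeps natural-looking motion blur even though the frame rate itself feels different. A minimal sketch of that arithmetic (the function name is mine, not from any camera API):

```python
def shutter_angle(fps, shutter_denominator):
    """Shutter angle in degrees: 360 * (exposure time / frame duration)."""
    return 360.0 * fps / shutter_denominator

print(shutter_angle(59.94, 120))   # ~179.8 degrees
print(shutter_angle(23.976, 48))   # ~179.8 degrees as well
```

By comparison, 23.976 fps at 1/48 gives the same ~179.8 degrees, which is why both combinations read as “normal” blur.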
Due to the production’s limited budget, pulling off the fireworks scene was challenging. From the start, I knew I didn’t want to use a green screen, as I wanted the moves to appear exactly as they did in camera. Plus, even if I wanted to, we didn’t have the crew to pull it off, given the spinning nature of the shot.
I knew I wanted her world to have a distinct feel, so that the viewer would subconsciously always know when we were in it, which is why I opted to shoot the sequence at 60 fps and at a different shutter speed.
EDITING AND COLOR GRADING
Aside from using some free audio plugins available from Soundly and a couple of VFX shots to fix timing and remove personal info, everything I did was fairly basic. The transition from reality into Sharon’s fantasy included some details that might be hard to pick up on. For example, our last shot in reality has Josh’s hand on her shoulder, but in the first shot of fantasy, his hand is on the car and her head is in a different position.
My editing style has always tended to focus less on continuity and more on using the footage to portray a specific emotion since the camera itself is a character. I figure that when a character is retreating into their mind, they might not necessarily feel what is going on in the world around them.
One of my favorite things to do is let the grade express a character’s emotions, especially when we are clear on whose story this is; for this chapter, we are looking at it from Sharon’s perspective.
Throughout the sequence, we cross-dissolve between two different grades, from the muted colors of reality to the vibrant, make-me-want-to-vomit colors of fantasy, until we get to Fantasy Josh confessing his feelings.
But most of the look for this project was built to work with the footage as if I were developing a real film stock. I didn’t just want to slap a LUT on the project and call it a day. Every piece of the final color was built to mimic the characteristics of Kodak 250D, halation and all. The gimbal shots do contain some extra blur to better match the softer Helios lens, but it’s not a radial blur; it’s more uniform, to match the look of the lens itself.
I pulled up some shots from films like The Fabelmans, since I knew it was shot on the film stock I was trying to emulate, to make sure I kept my color reproduction within what that stock can actually render.
Something I would like to try in the future to help nail this look is what director Adam Carter Rehmeier did on the set of his film Snack Shack: before they shot, he took a film camera, shot a still of the scene, and gave that to the colorist to color match.
I also used Google’s Gemini AI to help me code a system for the credits that would let me better organize and more quickly create endcards for films from spreadsheets, similar to paid tools that exist. However, this solution struggled with the Patreon subscriber thanks, so I ended up using the same Photoshop template I’ve used for other films and client credits. The base came from a Photoshop file from Indie Film Hustle; I’ve tried to find a link for it, but I don’t think it still exists on their website, which is why I’m not sharing one here.
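The core of the spreadsheet-to-endcard idea is simple: group credit rows by role and emit one text block per role. The sketch below isn’t my actual Gemini-generated script, just a minimal illustration with made-up column names (`role`, `name`):

```python
import csv
import io

def build_endcards(csv_text):
    """Group credit rows by role and emit one endcard text block per role."""
    reader = csv.DictReader(io.StringIO(csv_text))
    roles = {}  # dicts preserve insertion order, so roles keep sheet order
    for row in reader:
        roles.setdefault(row["role"], []).append(row["name"])
    # One endcard per role: role name in caps, then the names under it
    return [role.upper() + "\n" + "\n".join(names) for role, names in roles.items()]

sheet = """role,name
Director,Justin Janoson
Cast,Maya Borhaug
Cast,Baron Leung
"""
for card in build_endcards(sheet):
    print(card, end="\n\n")
```

From there, each text block can be dropped onto a card template for export; that last step is where my version fell apart on the longer Patreon lists.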
VFX
My original plan for the fireworks sequence was to use the Roto Brush and 3D camera tracker native to After Effects.
However, I ran into two issues.
Roto Brush was taking forever, and my ADHD brain was being way too impatient as it attempted to tackle all 702 frames in the sequence. Plus, I would still have to mask and track the buildings.
The shallow depth of field and lack of visible ground caused AE’s 3D camera tracker to go on strike.
To tackle the first problem, I thought back to last year, when I had used Photoshop’s AI depth matte feature combined with VoluMax, a tool I had bought for AE years ago, to give 3D motion to a photo in an Instagram reel I edited for the podcast Unapologetic: The Third Narrative. I had also seen tutorials for DaVinci Resolve’s AI depth map feature, and I wondered if there was a tool that worked similarly in After Effects and could be applied to video.
I found paid plugins, but I didn’t want to spend any money since I had no idea how well this would work, especially on a shot with shallow depth of field (also, many are just reskins of free tools anyway). After scouring YouTube, I came across a tutorial on how to do it for free by downloading a program from GitHub. I can’t recommend downloading random files from GitHub, but it worked for me, so use at your own risk.
This program is only available for Windows, not Mac, so I sent the scene over to my desktop, and this was the result.
Once I had this matte, I used the Levels and Luma Key effects to control where the fireworks would be excluded. I actually have three separate mattes that we cut between.
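Conceptually, the Levels-plus-Luma-Key chain on a depth matte is just a remap followed by a threshold: remap the depth luminance, then hold out (zero alpha) anywhere brighter than the cutoff, i.e. anywhere too close to camera. Here is a rough sketch of the idea; the black/white points and threshold are illustrative, not my actual settings:

```python
def luma_key_holdout(depth_luma, levels_black=0.2, levels_white=0.8, threshold=0.5):
    """
    Approximate a Levels -> Luma Key chain on a depth matte.
    depth_luma: 0..1 luminance values (brighter = closer to camera).
    Returns alpha: 0.0 where fireworks are held out (foreground),
    1.0 where they may appear (background/sky).
    """
    alpha = []
    for v in depth_luma:
        # Levels: remap input black/white points, clamp to 0..1
        remapped = (v - levels_black) / (levels_white - levels_black)
        remapped = min(1.0, max(0.0, remapped))
        # Luma Key, keying out the brighter (closer) pixels
        alpha.append(0.0 if remapped > threshold else 1.0)
    return alpha

print(luma_key_holdout([0.1, 0.5, 0.95]))  # → [1.0, 1.0, 0.0]
```

Cutting between three mattes just means swapping which depth frame feeds this chain at any given moment.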
Now I had to start placing the fireworks. But I still hadn’t tracked any points, so the fireworks couldn’t yet sit realistically within the space. I decided to bring the footage into Blender to see if I could get the track there, but that also failed.
My solution (while janky) was to track individual points in the background such as the buildings, trees, telephone poles, etc. using Mocha AE. I then applied the transform data to individual nulls timed specifically to where that object appears on screen; however, because the camera revolves around the talent, there is no object that stays in the frame for its entirety besides Sharon.
I decided to approach the shot as if I were Spider-Man. What do I mean by that? Well, when Spider-Man swings across the city, he shoots a web that attaches to a building. But at a certain point, he has to decide whether to let go and move forward, or hold on and swing back. So I parented each firework asset (courtesy of ActionVFX and ProductionCrate.com) to a null. When that null’s track ended or started to get wonky, I would cut the clip and attach it to a new null, then repeat the process. Most of the fireworks ended up attached to three separate nulls, and at some points you can tell where the handoff happens, but for the most part the fireworks fit into the scene as well as they could. In total, the fireworks comp had 134 different layers.
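That web-swinging handoff can be sketched as a tiny planner: given the frame range where each null’s track is trustworthy, walk the comp and switch to a new null whenever the current one runs out. The track ranges below are made up for illustration; in practice I did this by eye in the timeline:

```python
def plan_handoffs(null_tracks, comp_start, comp_end):
    """
    null_tracks: dict of null name -> (first_good_frame, last_good_frame),
    the span where that null's track is still trustworthy.
    Returns (null, start, end) cuts covering the comp, switching
    ("letting go of the web") whenever the current track runs out.
    """
    cuts, frame = [], comp_start
    while frame <= comp_end:
        # candidate tracks usable at this frame
        usable = [(name, span) for name, span in null_tracks.items()
                  if span[0] <= frame <= span[1]]
        if not usable:
            raise ValueError(f"no usable track at frame {frame}")
        # grab the one that survives the longest from here
        name, span = max(usable, key=lambda c: c[1][1])
        end = min(span[1], comp_end)
        cuts.append((name, frame, end))
        frame = end + 1
    return cuts

tracks = {"building": (0, 250), "tree": (200, 480), "pole": (430, 701)}
print(plan_handoffs(tracks, 0, 701))
```

With three overlapping tracks covering the 702 frames, each firework clip gets cut into three segments, which matches how the comp ended up.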
Once I had the timing where I liked it, I would turn off the footage layer, export the fireworks solo as Apple ProRes 4444 with alpha, and place that over the footage.
The depth of field on the fireworks is not physically accurate. Once I had the timing right, I just eyeballed it using Fast Box Blur. If you look closely, you can probably tell that everything is blurred by the same amount, but since most people will be watching this on their phones, I decided it didn’t really matter.
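If you’re wondering what “blurred by the same amount” means in practice: a box blur just averages each sample with its neighbors using one fixed radius everywhere, with no falloff based on distance from camera, which is exactly what gives it away on a big screen. A one-dimensional sketch of the idea (not After Effects’ actual implementation):

```python
def box_blur_1d(values, radius):
    """Uniform box blur: every sample averaged over a fixed-radius window,
    the same radius everywhere (no depth-aware falloff)."""
    out = []
    n = len(values)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n - 1, i + radius)
        window = values[lo:hi + 1]
        out.append(sum(window) / len(window))
    return out

# A single bright sample spreads evenly into its neighbors
print(box_blur_1d([0, 0, 10, 0, 0], 1))
```

A physically accurate defocus would vary that radius per pixel based on depth; eyeballing one global radius is the cheap version.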
I also tried using a different AI tool to get a normal map so I could relight the actors to be affected by the fireworks in Blender, but since the sequence takes place during the daytime, it looked unnatural, and I decided against it. Here are the maps it spat out, in case you’re curious.
If you want to use the tool for yourself, here is a link for that.
Once I had the comp looking good, I exported everything in the S-Log3 color space so I could manipulate the image in post. When I was doing the VFX, I wasn’t looking at it in slow motion or with the zooms that were in the final, because I wanted to treat the footage as though it had been shot practically. The first time I saw the footage slowed down was when I brought it into Premiere and began to experiment with the speed.
Everything else used similar principles of tracking and removing things here and there. I alternated between the clone stamp and Generative Fill in Photoshop to create a clean plate, then tracked that in well enough to pass. But no shot was designed from the start with AI.
CONCLUSIONS
In total, we had two and a half crew members (Abhinav was only available for half the shoot) plus two cast members, and we were able to produce a fairly polished project on a super low budget with a relatively quick turnaround time.
Going forward, our strategy for originals will be to focus more on what we can produce internally to grow, rather than following our previous strategy, which relied on pitching outside companies to either invest or co-produce.