MUSIC REPLACEMENT OF THE SHORT FILM “EL CAMINO”
This is my second assignment for the subject “Logic Pro X”, taught by Jaime Bermudez, taking into account all the music-narrative advice given by Ivan Capillas in his subject “Music for the Cinema”. I have replaced all the music in this excerpt of "El Camino" but kept the dialogue and the original sound of the scenes, which was not of very good quality and was not separated into stems, because the material had not been edited yet. In fact, Ivan is the original composer of this short film, and it was he who provided us with the excerpt for the assignment.
So, I do not own any of the images uploaded here or on any of my social networks related to this film. | USE THEM ONLY FOR DEMO PURPOSES.
Here you can see the video and the Logic Pro X project at the same time:
I was not scoring the whole short film, so my approach in narrative terms was a bit different than if I had done so. I am assuming that my audience does not know the whole story: the only thing they see is a very sad woman who is anxious through all the scenes until she bursts into tears at the beach. At a certain moment a man appears, alone in the darkness, apparently missing her (or at least having something to do with her; otherwise that scene would not be there), so I gave him a bit of protagonism by leaving his moment entirely without music. This might not have been my choice if I had wanted to give him another meaning when scoring the whole short film, but for this particular narration I thought it would give him a tone of mystery, since the music stops quite abruptly and that obviously attracts the audience's attention.
The whole tone of the excerpt is deeply sad, to accompany the feelings of the actress. I also created a simple theme related to this sadness, which I managed to make appear twice during the only four minutes the scenes last. It is played by the strings and led by the main violin, with its entrances and exits made as smooth as possible. The very low strings play at specific moments where I wanted to convey that the actress's stress was going deeper. Another key element in the excerpt was the clock and the idea of time going by, so when we see the actress running through the streets, or rushing in the car, I wanted the synths to echo that clock ticking alongside her, but in a way that was not too obvious, so it would not become repetitive for the listener.
The instrumentation is based on strings and orchestral sounds from virtual instrument plugins, although many of them have been rendered to audio so the sound could be shaped in ways that would not be possible with the MIDI files. Other instruments have more of a synth texture, but they fill frequency gaps among the rest and work a bit like a glue sound. In fact, some high synths try to recreate the harmonics a real violin would produce, which are not that easy to recreate with a digital library. Others are simply there to create more tension. The very low strings, which I generally call SUB BASS (see picture below), are also used for that purpose, as I mentioned in the previous section.
I did not have access to the sounds and dialogue separately, so at the end of the video, when the actress speaks, I could not process her voice without also processing the sound of the beach waves. I used iZotope RX's De-hum to make the waves less bothersome, and Neutron's EQ and exciter on the voice's frequencies to raise its presence. The RX noise remover was perfect for the moments of clock ticking at the beginning of the video.
On the other hand, something I worried about was the low frequencies that all the orchestral and pad instruments were accumulating, which had to be cleaned up. One of my best tools in that case was iZotope's EQ, which you can place on different tracks and link together, making it easy to compare them and see clearly which frequencies are clashing. I was also careful with my SUB BASS sound, so that it stayed consistent when it played and did not pile onto the low frequencies of other instruments.
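The principle behind that cleanup can be sketched outside of any plugin: rolling off energy below a cutoff is, at its simplest, a high-pass filter. This is a hypothetical, minimal illustration in plain Python (a first-order filter, nothing like what the iZotope plugins actually do internally); the function names and the 80 Hz cutoff are my own assumptions for the example:

```python
import math

def highpass(samples, sample_rate, cutoff_hz=80.0):
    """First-order high-pass filter: attenuates rumble below cutoff_hz,
    like a high-pass EQ band cleaning up a pad track's low end."""
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    a = rc / (rc + dt)
    out = [samples[0]]
    for i in range(1, len(samples)):
        out.append(a * (out[-1] + samples[i] - samples[i - 1]))
    return out

def rms(samples):
    """Root-mean-square level, a rough loudness measure."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

sr = 8000

def sine(freq, seconds=1.0):
    """One test tone at the given frequency."""
    return [math.sin(2 * math.pi * freq * i / sr) for i in range(int(sr * seconds))]

# Rumble at 40 Hz is strongly attenuated; content at 440 Hz passes almost intact.
rumble_out = highpass(sine(40), sr)
tone_out = highpass(sine(440), sr)
```

Stacking many pads is like summing many of those 40 Hz components: each one seems harmless, but together they muddy the low end, which is why per-track filtering pays off.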
Another issue that quite commonly happens when you work with many orchestral library sounds, each with its own reverb and acoustic settings, is that the frequencies of those reverberation tails also accumulate and can cause an unwanted background hum. So I tried to use as little reverb as possible without losing the depth I wanted, and was careful with the EQ, not only on every track as I mentioned before, but also on the master track, where I dealt with some early mastering issues using the Ozone plugin.
It is not the first time I have recorded myself, but it was a requirement of the assignment. So I recorded my voice and yes, as you can see below, in little chunks, because the notes were long and sustained and, to be honest, I do not have very big lungs.
The idea was to create an organic choir sound that would give the music an even sadder feeling alongside the orchestral sounds. I therefore recorded several takes of the same notes of the harmony, some drier in reverb than others, to create depth without losing the definition of the characteristic texture of the female voice.
In the car scene there are two synth layers that provide the background rhythm. They are meant to convey that the actress's mind is racing anxiously while she drives, and that she is in a rush to get to the place where she will find out what worries her.
For the last Logic Pro X assignment I worked with Alchemy, so for this one I wanted to experiment with a different synth: Native Instruments' Prism. I did not find it very intuitive at first when looking for the oscillators, filters and other basic modules of a typical synth. However, after playing around with it a little, starting from some of its presets, I found the sound I was looking for as the motor of the scene.
Both layers play exactly the same notes, but the first covers the lower frequencies that introduce the scene, while the second sits somewhat higher and opens up the sound as the scene goes on, until both layers disappear into the next image.
Locking the SMPTE position of the clips (SMPTE stands for Society of Motion Picture and Television Engineers) means locking them to the frames of the video (if there is one; it can be done without a video anyway), so the clips are not time-stretched or moved from their positions when you make tempo variations in the project. It is useful when you want to change the BPM because your next cue has a different tempo, but you do not want the clips you have placed so far to be messed up by mistake.
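The effect of SMPTE-locking comes down to simple arithmetic: a clip stored as a beat position moves in absolute time when the BPM changes, while a clip stored as a timecode does not. A hypothetical Python sketch of that idea (my own function names, constant tempo assumed; this is not how Logic stores anything internally):

```python
def beats_to_seconds(beat, bpm):
    """Absolute time of a beat position, assuming a constant tempo."""
    return beat * 60.0 / bpm

def seconds_to_smpte(seconds, fps=25):
    """Format an absolute time as an SMPTE-style HH:MM:SS:FF timecode."""
    frames = round(seconds * fps)
    ff = frames % fps
    s = frames // fps
    return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{ff:02d}"

# A hit placed on beat 32 at the project tempo of 85 BPM:
hit_seconds = beats_to_seconds(32, 85)
locked_timecode = seconds_to_smpte(hit_seconds)

# If the tempo later changes to 100 BPM, a clip defined only by its beat
# position drifts to an earlier absolute time; an SMPTE-locked clip keeps
# its original timecode regardless of the new tempo.
drifted_seconds = beats_to_seconds(32, 100)
```

The sync point the hit was scored against lives at a fixed frame of the picture, which is exactly why the lock anchors the clip to timecode rather than to beats.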
On this occasion I did not really need it, since I felt that a tempo of 85 BPM suited all four cues very well, and I did not need to change the BPM even for a looser section in which the instruments follow their own pace, since I played them myself on the keyboard. However, the project is currently locked in case I want to continue it one day, and it is a feature of Logic Pro X that I will certainly use in the future with longer videos.
Logic Pro X has many options for transforming the MIDI data of your arrangements quickly. I did not process my MIDI files much, because I usually record the orchestral instruments by playing them on the keyboard so that they keep the humanized feeling I want to give them. However, I wanted my synth clips to sound very mechanical, so I used the MIDI Transform tools to quickly set a fixed velocity and length for every note:
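Conceptually, that "fix to a constant value" operation is just one pass over the note events. A hypothetical Python sketch of the idea, using plain dictionaries rather than anything resembling Logic's internal event format (the field names and default values are my own assumptions):

```python
def mechanize(notes, velocity=100, length_ticks=240):
    """Give every note the same velocity and length, in the spirit of
    Logic's MIDI Transform 'Fixed' operations, for a mechanical feel.
    Pitch and start position are left untouched."""
    return [
        {"pitch": n["pitch"], "start": n["start"],
         "velocity": velocity, "length": length_ticks}
        for n in notes
    ]

# A loosely played synth line: uneven velocities and note lengths.
played = [
    {"pitch": 60, "start": 0,   "velocity": 83, "length": 230},
    {"pitch": 63, "start": 240, "velocity": 97, "length": 260},
    {"pitch": 67, "start": 480, "velocity": 74, "length": 250},
]
mechanical = mechanize(played)
```

Flattening velocity and length this way removes the small human variations that make played parts feel alive, which is exactly the point here: the synth layers are meant to tick like the clock, not breathe like the strings.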