A list of puns related to "Bus Master"
Hey everyone. I'm an amateur producer and I mostly arrange, edit, and mix, without much mastering experience. However, I will frequently do a "quick master" while mixing so my tracks are easier to listen to on earbuds, car speakers, etc. I will typically add some light compression on the master, some EQ, and a limiter. My question for you all: when I return to mixing, should I turn off the mastering effects I applied in the last session, or is it harmless to leave them on?
Best Ableton Master Bus plug-in I've come across -- it just f'ing rocks!
I ran into a weird issue where I2C isn't acting the way I expected. I am attempting to communicate on the I2C bus using the IPMB protocol. I hooked it up to a logic analyzer and sent a write command, and to my surprise I got a write back immediately afterwards from the device. It appears that both I and the BMC I am talking to are masters and slaves at the same time. Is there a specific name for this? Are there libraries that support it? How can I handle this easily in software? Currently we are looking to use some sort of smallish Linux computer to talk to this; I am experimenting with a Raspberry Pi, a Pico, and an STM32 right now. Maybe I need to use two I2C buses, one as slave and one as master?
In the logic analyzer I send:
write to 0x10 ack data: 0x00 0xE0 0x20 0x04 0x01 0xDB
And the data comes back as this:
write to 0x10 ack data: 0x04 0xDC 0x20 0x04 0x01 0x00 0x20 0x00 0x40 0x7B
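Those bytes check out against the IPMB framing: each message carries two two's-complement checksums, one over the address/netFn header and one over the rest. A quick sketch (field names per the IPMB v1.0 spec; the addresses and bytes are taken from the capture above, where 7-bit address 0x10 is the 8-bit write address 0x20):

```python
# IPMB checksum sketch. Field names follow the IPMB v1.0 spec; the
# example bytes come from the logic-analyzer capture (7-bit address
# 0x10 = 8-bit write address 0x20).

def ipmb_checksum(covered_bytes):
    """Two's-complement checksum: the sum of the covered bytes plus
    the checksum itself must be zero mod 256."""
    return (-sum(covered_bytes)) & 0xFF

def build_ipmb_frame(rs_sa, netfn_lun, rq_sa, rq_seq_lun, cmd, data=b""):
    """Bytes written on the bus after the responder's slave address.
    rs_sa / rq_sa are the 8-bit slave addresses."""
    chk1 = ipmb_checksum([rs_sa, netfn_lun])   # header checksum
    body = [rq_sa, rq_seq_lun, cmd, *data]
    chk2 = ipmb_checksum(body)                 # data checksum
    return bytes([netfn_lun, chk1, *body, chk2])

# Request from the capture: 0x00 0xE0 0x20 0x04 0x01 0xDB
req = build_ipmb_frame(rs_sa=0x20, netfn_lun=0x00,
                       rq_sa=0x20, rq_seq_lun=0x04, cmd=0x01)
print(req.hex())  # 00e0200401db
```

The response in the capture verifies the same way: 0xDC = -(0x20 + 0x04) mod 256, and 0x7B closes out the remaining bytes, which is a good sanity check that both sides really are issuing master writes as the spec describes.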
From the documentation:
https://www.intel.com/content/www/us/en/servers/ipmi/ipmp-spec-v1-0.html
The IPMB uses a Request / Response protocol. A Request Message is issued to an intelligent device.
The device responds with a separate Response Message. Both Request Messages and Response
Messages are transmitted on the bus using I2C Master Write transfers. That is, a Request Message is
issued from an intelligent device acting as an I2C master, and is received by an Intelligent Device as an
I2C slave. The corresponding Response Message is issued from the responding intelligent device as an
I2C master, and is received by the request originator as an I2C slave.
I have found an implementation of this working with an Aardvark I2C device, and sure enough, that is what they are doing: they write and immediately read.
https://youtu.be/yxmsP8mDva4
In this guy's French house tutorial, at the 33:30 mark he discusses a particular EQ that will give the track a French feel. He implies that this is perhaps a plug-in version of a real EQ that French producers were using back in the day.
Does anyone have any more info on this? Was there a particular EQ or mastering technique that was used?
Pretty simple question: when making sure to keep the low end in mono, do you treat each track, or do you only do it on the master bus?
For example, in Ableton, there is the Utility device with the "Bass Mono" button. Would you put this on each track, or would you only put it on the master?
Bonus question: what is the typical max frequency you force to mono?
Thank you much!
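For what it's worth, what a "bass mono" stage does can be sketched in a few lines of DSP. This is a rough illustration, not how Ableton's Utility is actually implemented, and the 120 Hz cutoff is just a common starting point, not a rule:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def bass_mono(stereo, fs, cutoff=120.0, order=4):
    """Sum everything below `cutoff` to mono; leave the rest untouched.
    stereo: (n, 2) float array. Zero-phase filtering keeps the band
    split trivially complementary for this illustration (a real-time
    plug-in would use a proper crossover instead)."""
    sos = butter(order, cutoff, btype="low", fs=fs, output="sos")
    low = sosfiltfilt(sos, stereo, axis=0)
    high = stereo - low                          # complementary high band
    mono_low = low.mean(axis=1, keepdims=True)   # collapse lows to mono
    return high + mono_low

fs = 48000
t = np.arange(fs) / fs
# Out-of-phase 60 Hz bass plus in-phase 2 kHz content:
left = np.sin(2 * np.pi * 60 * t) + 0.3 * np.sin(2 * np.pi * 2000 * t)
right = -np.sin(2 * np.pi * 60 * t) + 0.3 * np.sin(2 * np.pi * 2000 * t)
out = bass_mono(np.stack([left, right], axis=1), fs)
# The out-of-phase 60 Hz collapses; the 2 kHz content survives intact.
```

The point of doing it per track vs. on the master is mostly workflow: on the master it is a single safety net, while per track it lets you decide which sources deserve stereo lows at all.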
Hi folks,
I'm not a fan of the iZotope Insight or Waves PAZ analyzer UIs, so I'm looking for an alternative plugin that analyzes frequency spectrum, stereo field, RMS, and peak gain.
What are you using?
So I am a musician, and I mix and master my own stuff too. Genre is pop, rock, and stuff like that.
I was just wondering: how many of you mix with some sort of effects on the master bus? Like, to get an idea of how the finished product will sound, and maybe even print the song with those effects on.
Also, do you put reverb on your master bus?
Hey guys, I've got a quick question. Can I think of a folder master track and a bus (with the tracks routed into it, unrouted from the main master parent) as the same thing?
I often hear interesting textures that would need a lot of automation to recreate, and I really wish I could capture them on the fly. I'm on Cubase 10.5, btw.
Lately I came up with this simple but cool thing you could try. I'm sure it's not something new, but I thought I'd share!
Put a saturation/distortion plug-in on the master bus and set it so it's very subtle. Automate it so you crank it a bit more in some sections of the song, like the choruses or a bridge that needs a push. This is a cool way to bring in more energy when you need it! The lows especially will saturate a bit more, giving the effect of more presence and brightening up the section. I like Soundtoys Decapitator or Radiator for this; the mix knob is great for the purpose!
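As a rough illustration of the idea (not the Decapitator algorithm, just a generic tanh soft-clipper with a piecewise drive automation lane; the section times and drive values are made up):

```python
import numpy as np

def saturate(x, drive):
    """Soft-clip saturation; higher drive pushes the signal harder
    into tanh. Normalised so a full-scale input still peaks near 1.0."""
    return np.tanh(drive * x) / np.tanh(drive)

def drive_automation(n, fs, breakpoints):
    """Piecewise-constant automation lane: breakpoints is a list of
    (start_seconds, drive) pairs."""
    env = np.ones(n)
    for start, d in breakpoints:
        env[int(start * fs):] = d
    return env

fs = 44100
t = np.arange(4 * fs) / fs
song = 0.8 * np.sin(2 * np.pi * 110 * t)  # stand-in for the mix bus
# Subtle in the verse (0-2 s), cranked for the chorus (2 s onward):
drive = drive_automation(len(song), fs, [(0.0, 1.2), (2.0, 3.0)])
out = saturate(song, drive)
```

The extra drive raises the RMS level of the section without raising its peak, which is exactly the "push" effect described above.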
In their Mix with the Masters webinar, Xavier described how their master bus was set up during the production of Cross: a stack of 4 stock Cubase comps + 4 Cubase QFuzz (set to 0), in this order: Fuzz/Comp/Fuzz/Comp/Fuzz/Comp/Fuzz/Comp.
Since I still use Cubase SX3 nowadays, it was no secret to me that the QFuzz was a major part of their sound (SebastiAn's too), but I didn't expect something so extreme on the master bus.
I did try to mess around with it but, of course, didn't come up with something that sounds even close lol.
Has anybody tried this trick and gotten good results?
When mixing, is it okay to pull down the master fader to stop clipping? None of my individual instruments are clipping, but the overall output is. What I've been doing to counter this is pulling down the master bus fader and then increasing my computer volume, but this feels like bad practice.
Use case is a low-end recording/streaming setup in a church.
Setup is a Win10 laptop (link with specs below) running vMix, receiving input from 1-2 handheld video cameras over HDMI, and stereo audio at 16-bit/48 kHz from a KT DN9630 over USB.
What happens is that once in a while, vMix will sort of drop the audio: the input channel will still show signal, and if you solo the track you can monitor it over headphones, but nothing gets sent to the Master Output unless you either close and re-open the template or delete and re-add the input device.
Most recently, this happened in the middle of recording and the video recording kept going while the audio just dropped out. (The audio file continued to record, but recorded silence)
The problem has happened at least a few times, but only very occasionally and sometimes without somebody monitoring it, so the amount of detail I can provide is limited.
Any ideas on what would cause this? My first thought was some kind of driver conflict, which could be remedied by embedding the audio signal into one of the camera feeds upstream of the computer, allowing us to ditch the DN9630 altogether. But other folks I've talked to have suggested that it's triggered by a CPU overload, which makes me wonder if the problem would still happen if we were pulling audio from the cameras instead of a separate device.
ETA: This is the laptop in question:
https://www.amazon.com/Acer-Display-i7-9750H-Keyboard-AN715-51-70TG/dp/B07QXT3WC6
Usually I will make maybe a 1-2 minute track and then throw a few plugins and an outboard bus compressor on the master, so I can export a decent master to send to some singers and rappers for feedback and to see if it's something they want to use. When I go back to actually complete the track, I usually keep those plugins enabled on the master track and produce/mix through them. I'll usually do a lot of tweaking before I get the final mix finished, though. Is this a good habit, or should I disable them while I'm still working on the production part?
For reference, my master chain usually goes something like this:
TK Audio BC501 (outboard bus compressor similar to an SSL G comp) > Soundtheory Gullfoss (kinda hard to explain; it's an algorithmic EQ that just makes everything sound nice and clear) > Slate Fresh Air (mid and high exciter for air) > Submission Audio Flatline (emulates clipping fancy converters) > Waves L1+ Ultramaximizer (used mostly as a peak limiter and to squeeze out a few extra LUFS)
I usually mix to about -8 to -5 dBTP and master to -1.5 dBTP, anywhere from -18 to -12 LUFS. Most of the gain in the mastering stage comes from Flatline. I only got that plugin this week, so I'm still figuring out how to use it; that's why I always use the L1+ after it in true-peak mode, just to make sure I'm not clipping. Also, I use the free Youlean Loudness Meter 2 plugin for LUFS, dynamic range, and true peak, followed by Sonarworks Reference 4 (room/speaker calibration), which is always on my master no matter what.
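For anyone curious what "true peak mode" is guarding against: inter-sample peaks that a plain sample-peak meter misses. A rough sketch of the oversampling approach (in the spirit of BS.1770-style metering, not Waves' actual implementation):

```python
import numpy as np
from scipy.signal import resample_poly

def true_peak_db(x, oversample=4):
    """Estimate the inter-sample (true) peak by oversampling, roughly
    what a BS.1770-style true-peak meter does. Returns dBTP."""
    up = resample_poly(x, oversample, 1)
    return 20 * np.log10(np.max(np.abs(up)))

# A full-scale sine at fs/4, sampled 45 degrees off its peaks: every
# sample lands at +/-0.707, but the waveform itself reaches 1.0
# between samples.
n = np.arange(4800)
x = np.sin(2 * np.pi * 0.25 * n + np.pi / 4)
sample_peak_db = 20 * np.log10(np.max(np.abs(x)))  # about -3 dBFS
tp_db = true_peak_db(x)                            # close to 0 dBTP
```

So a mix whose sample peaks read -1.5 dBFS can still overshoot on playback after D/A reconstruction, which is why mastering to a dBTP target rather than a dBFS target is the safer habit.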
Also, I mostly produce vinyl sample based hip hop instrumentals, kind of similar to the style of beats used by Griselda. I also do basic trap beats from time to time, but I've been finding them kind of boring to produce recently so I've kinda moved on from them.
What are your thoughts on this? Cheers
Hey, I've bought a master chain preset and I can't figure out how to get the .ptxp file into my Pro Tools session. Please help me! :(
(Windows 10, Pro Tools 12.5)
Yes, I get it: you should avoid using any processing on your master channel. But what about tape saturation/emulation?
In the old days, producers would always record to tape before sending the mix out to the mastering engineer. So why is this technique frowned upon nowadays?
There seems to be a certain flavour of saturation that you only get when you apply the FX to the master. How would you guys go about this?