"Set your channel gain to 11-12 o'clock."
Telling people to leave their trim-gain knobs at one physical rotation spot (or narrow range) is bad advice. That only worked for him because he was using test signals of known level.
People need to try to stay out of the top two meter LEDs on all this gear. The second-to-top LED is really only for accidents, since once you're in it you have no way of knowing how far from clip you are. I realize that's asking a lot in this case if the current second-to-top LED on the GO encompasses a whole 18 dB. If you're tempted to go into the meter zero on the GO, frequently back the level off and test how deep into it you are. Obviously it would be less tempting to go into the second-to-top if InMusic changed the GO meter zero to a higher value and had that LED encompass a smaller range (say, some amount from 3 to 9 dB... take your pick), like they do on Numark digital mixers with similar meters. Then you gotta alter the scaling of the LEDs below. I don't think anyone will complain about the lower LEDs' markings below zero no longer being accurate.
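The second-to-top ambiguity is easy to sketch. A minimal toy model with hypothetical threshold values (illustrative guesses, not InMusic's actual calibration):

```python
# Hypothetical meter scales: the dBFS threshold at which each LED lights.
# These numbers are illustrative guesses, NOT InMusic's real calibration.
GO_LIKE = [-52, -40, -30, -24, -18, 0]   # second-to-top zone spans 18 dB
TIGHTER = [-52, -40, -30, -18, -6, 0]    # second-to-top zone spans only 6 dB

def lit_leds(level_db, thresholds):
    """Number of LEDs lit for a signal at level_db (dBFS)."""
    return sum(level_db >= t for t in thresholds)

# A -15 dBFS signal sits inside the GO-style second-to-top zone, but the
# meter can't tell you whether you're 3 dB or 17 dB away from clip.
print(lit_leds(-15, GO_LIKE))  # 5 of 6 LEDs
print(lit_leds(-15, TIGHTER))  # 4 of 6 LEDs
```

With the tighter scale, landing in the second-to-top LED pins you to a known narrow window below clip instead of an 18 dB guess.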
"if you increase the channel gain past 1.30 o'clock (see settings in picture below and also the harmonic distortion of the 110 Hz signal in the FFT plot) there WILL be significant harmonic distortion even if the blue VU LED is not lit and there is no hard clipping and I suspect that this is what the Germans did. But WHY on earth would you do this???"
What's strange is not what the Germans did, but that the GO is doing that at all. Is there a defeatable compressor/limiter turned on in the GO? That should definitely be off when running tests. Did he have the tone controls' isolator mode on? Tell him to switch it to EQ and run the tests again. Isolators are summed crossover-style filters and add a little group-delay phase distortion. The X1800 has a nifty bypass for the iso when a channel's tone controls are centered (at least it did as of the last firmware I checked), but I don't know that the GO has that. The X1800 has the top meter LEDs as -1 dBFS. So if it's not the iso he's seeing either, the other explanations are something like inter-sample clipping during oversampling... or the gain structure somewhere in the virtual signal path (it's DSP stuff, after all) being even worse than anticipated... or a design screw-up in the GO analog output stages... or an issue on his own ADC stage. Did he ensure he was not clipping his interface's inputs? Some guy on YouTube did a measurements comparison of a Pioneer DJM and a Xone, was clearly clipping his interface inputs, and claimed one of the pieces had a problem with its levels, and that was done for one of the major online DJ rags!
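On the inter-sample clipping idea: the textbook example is a sine at fs/4 with a 45-degree phase offset, scaled so every sample lands at exactly ±1.0 full scale while the reconstructed waveform between samples peaks about 3 dB higher. A quick pure-Python check using naive windowed-sinc interpolation (nothing GO-specific, just the general phenomenon):

```python
import math

# Sine at fs/4 with 45-degree phase, scaled so the SAMPLES hit +/-1.0 (0 dBFS).
# The band-limited waveform they represent peaks at sqrt(2) between samples.
N = 400
x = [math.sin(2 * math.pi * n / 4 + math.pi / 4) / math.sin(math.pi / 4)
     for n in range(N)]

def interp(samples, t):
    """Naive band-limited (sinc) reconstruction at fractional time t."""
    total = 0.0
    for n, s in enumerate(samples):
        d = t - n
        total += s if d == 0 else s * math.sin(math.pi * d) / (math.pi * d)
    return total

mid = interp(x, N // 2 + 0.5)        # halfway between two +1.0 samples
print(max(abs(s) for s in x))        # sample peak: 1.0, i.e. "not clipping"
print(abs(mid))                      # true peak: ~1.414, i.e. about +3 dBFS
```

A meter or DAC that only looks at sample values calls this signal clean, yet any oversampling or reconstruction stage downstream has to handle roughly +3 dB more than the meter showed.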
He also didn't say anything about whether or not there's ultrasonic garbage past the roll-off on the GO. For that matter, he doesn't appear to have measured the frequency response at all.
I'm not sure what "looks fine" means. A usual IMD test through an analog piece of gear sends the raw test signal to the gear and then analyzes the output in comparison. Pioneer CDJs from the mk3 on seem to produce the exact same amount of IMD as is present in the original test signal at zero pitch when using the SPDIF. The Prime standalone players' digital audio processing adds 100X the intermodulation distortion of the original signal in RMAA. The comparison to the original-signal measurement is what matters: how much more IMD there is compared to the original.
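For scale: assuming RMAA's IMD figure is an amplitude percentage, "100X the original" works out to a +40 dB delta. The comparison itself is one line of math (the percentages below are made-up placeholders, not measured figures):

```python
import math

def imd_delta_db(device_imd_pct, reference_imd_pct):
    """How much IMD the device adds relative to the raw test signal, in dB.
    Assumes the analyzer reports IMD as an amplitude percentage."""
    return 20 * math.log10(device_imd_pct / reference_imd_pct)

# Hypothetical numbers: test file measures 0.002% IMD on loopback,
# device output measures 0.2% with the same analyzer settings.
print(imd_delta_db(0.2, 0.002))   # 100x worse -> +40.0 dB
print(imd_delta_db(0.002, 0.002)) # identical to source -> 0.0 dB (CDJ-like)
```

This is why the absolute number alone can "look fine" while the delta against the original signal tells the real story.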
A couple of single tones also don't tell you how low the nonlinear distortion harmonics are. Hint: they're nonlinear because their "multiple" changes as the fundamental changes; they expand or squeeze as the sweep progresses. You have to do a complete sweep and look at the real-time FFT. This can be recorded in a video. His single pic does demonstrate the possibility that the GO's overall harmonic distortion from audio processing is lower than the last time I tested the standalone players, though, but it doesn't fully demonstrate it.
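To see why the harmonics move with the sweep: a memoryless nonlinearity puts energy at integer multiples of whatever the fundamental currently is, so the distortion products slide up (and spread apart) as the sweep rises. A toy sketch, using a tanh soft-clipper as a stand-in for the device under test; none of this is measured GO behavior:

```python
import math

FS = 48000
N = 4800  # 0.1 s frame; both test frequencies land exactly on 10 Hz bins

def bin_mag(x, freq):
    """Amplitude of one frequency component (single-bin DFT probe)."""
    w = 2 * math.pi * freq / FS
    re = sum(s * math.cos(w * n) for n, s in enumerate(x)) * 2 / len(x)
    im = sum(s * math.sin(w * n) for n, s in enumerate(x)) * 2 / len(x)
    return math.hypot(re, im)

def device(s):
    """Toy stand-in for the gear under test: a mild tanh soft-clipper."""
    return math.tanh(1.5 * s) / math.tanh(1.5)

h3 = {}
for f0 in (110.0, 1000.0):  # two snapshots along an imaginary sweep
    y = [device(math.sin(2 * math.pi * f0 * n / FS)) for n in range(N)]
    h3[f0] = bin_mag(y, 3 * f0)  # third harmonic sits at 3x f0, wherever f0 is
    print(f0, "Hz -> H3 level:", round(20 * math.log10(h3[f0]), 1), "dB")
```

At 110 Hz the third harmonic is at 330 Hz; at 1 kHz it's at 3 kHz. One FFT snapshot at one fundamental only probes one set of bins, which is why a full sweep watched on a real-time FFT tells you so much more than a single pic.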
If the GO's IMD and nonlinear distortion harmonics are all lower than the standalone players', it could be the result of the GO simply doing all its processing at 44.1, 48, or some near multiple on the unit, instead of doing it at 24/96 or some huge multiple of 96 kHz like the standalone players seem to, and then not having to bump it back down for the SPDIFs. Since the GO seems to only support up to 48 and has no SPDIF out, that would be one explanation if that's the case. It would also mean InMusic should easily be able to either let us change the rate the standalone players are using, or let the layer automatically change its rate to match the file, since the GO would be proof they can run with less SRC mucking things up. So that would definitely be a very good sign for all Prime users... sort of a light on the horizon.
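The SRC-avoidance guess is easy to make concrete: pushing 44.1 kHz material through a 96 kHz engine needs an ugly fractional resampling ratio, while running the engine at the file's own rate (or a clean integer multiple of it) makes the conversion trivial or unnecessary. The exact ratios:

```python
from fractions import Fraction

def src_ratio(src_hz, dst_hz):
    """Resampling ratio as an exact fraction (interpolate/decimate factors)."""
    r = Fraction(dst_hz, src_hz)
    return r.numerator, r.denominator

print(src_ratio(44100, 96000))  # (320, 147): fractional, real SRC work
print(src_ratio(48000, 96000))  # (2, 1): trivial integer upsample
print(src_ratio(44100, 44100))  # (1, 1): no SRC at all, nothing to muck up
```

A 320/147 ratio means a genuine polyphase resampler in the path on every 44.1 kHz file, which is exactly the stage that can add the artifacts being measured; a 1/1 or 2/1 path sidesteps it.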