When simulating sensitivities with enable_variable_scaling set to True, the returned sensitivity (e.g. dx1/dk1) appears to be scaled, but only by the nominal value of x1, not by the nominal value of k1.
I attach an example where the nominal value of x1 is set to 2 and the sensitivity with respect to parameter k1 is calculated. Without scaling the sensitivity dx1/dk1 is -12.34; with scaling it is -6.33 (a ratio of 1.95). Strangely, the nominal value set for k1 does not affect the result at all.
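For reference, here is a minimal sketch of the relationship I am observing and the workaround I am currently using. It assumes the tool returns d(x1/nominal_x1)/dk1, i.e. the sensitivity is divided only by the state nominal (the function name and the exact convention are my assumptions, not documented behaviour):

```python
def unscale_sensitivity(scaled_sens, nominal_x, nominal_k=1.0):
    """Recover an unscaled sensitivity dx/dk from the scaled one.

    Assumption (from the numbers above): the scaled value is
    d(x/nominal_x)/dk, so multiplying by nominal_x undoes it.
    If the parameter nominal were also applied, one would divide
    by nominal_k as well; in my runs nominal_k has no effect.
    """
    return scaled_sens * nominal_x / nominal_k

# With nominal(x1) = 2, the scaled value -6.33 maps back to -12.66,
# which is close to the unscaled result -12.34 (the ratio in my run
# was 1.95 rather than exactly 2).
recovered = unscale_sensitivity(-6.33, nominal_x=2.0)
print(recovered)  # -12.66
```

This recovers roughly the unscaled value, which is what leads me to believe only the state nominal enters the scaling.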
Is this scaling behaviour intended, or should the parameter nominal also be applied?