I'm attempting to calculate accurate displacements from measured accelerometer data. I've tried using both the 'Second Overall Level' converted to displacement in Online Processing, and using Throughput Processing. I've also tried the double integration option in the Time Signal Calculator, feeding my Throughput Accel time signals back into the calculator.
I'm getting wildly high displacement numbers either way, especially when using the Calculator, and the results from the two methods aren't even close to matching.
Is there some trick to double-integration that I'm missing? Perhaps this isn't feasible while using ICP accels? Has anyone used this feature in Test.Lab with any degree of accuracy/confidence in the data?
In order to integrate time histories correctly, a little care is needed. Any drift in the original signal can cause integration problems, so we use a DETREND_AC command to remove that drift and any low-frequency components. Because some integration methods introduce high-frequency components (a sawtooth effect on the result), we normally suggest upsampling before the integration and downsampling afterwards. We then use an HP filter to remove the constants of integration. So, to get meaningful results, we suggest the following sequence: DETREND_AC, RESAMPLING up by a factor of 4, INTEGRATE or DOUBLEINTEGRATE, RESAMPLING down by a factor of 4, HP_FILTER.
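Test.Lab's calculator commands themselves are proprietary, but the same detrend / upsample / integrate / downsample / high-pass sequence can be sketched in Python with NumPy and SciPy. Everything below (sample rate, signal frequency, filter cutoff) is a hypothetical example, not taken from the thread:

```python
import numpy as np
from scipy import signal
from scipy.integrate import cumulative_trapezoid

fs = 1024.0                               # hypothetical sample rate, Hz
t = np.arange(0.0, 2.0, 1.0 / fs)
# Synthetic 50 Hz acceleration with a small DC offset to mimic sensor drift
accel = np.sin(2 * np.pi * 50.0 * t) + 0.05

# 1) DETREND_AC equivalent: remove DC offset and linear drift
a = signal.detrend(accel, type="linear")

# 2) Upsample by 4 to suppress the sawtooth effect of time-domain integration
up = 4
a_up = signal.resample_poly(a, up, 1)
fs_up = fs * up

# 3) Double integration: accel -> velocity -> displacement
vel = cumulative_trapezoid(a_up, dx=1.0 / fs_up, initial=0.0)
disp = cumulative_trapezoid(vel, dx=1.0 / fs_up, initial=0.0)

# 4) Downsample back by 4
disp = signal.resample_poly(disp, 1, up)

# 5) High-pass filter to remove the constants of integration (the big
#    linear/quadratic drift terms that double integration produces)
sos = signal.butter(4, 5.0, btype="highpass", fs=fs, output="sos")
disp = signal.sosfiltfilt(sos, disp)

# For a unit sine at frequency f, displacement amplitude should be 1/(2*pi*f)^2
expected = 1.0 / (2 * np.pi * 50.0) ** 2
peak = np.max(np.abs(disp[512:1536]))     # interior region, away from edge transients
print(peak, expected)
```

Without steps 1 and 5 the integration constants dominate: the drift term from double integration is orders of magnitude larger than the real displacement, which is exactly the "wildly high numbers" symptom described above.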
An FAQ is attached. It is also available on the Global Technical Assistance Center (GTAC) website under the solution center.
With respect to the Overall Level question: the Overall Level includes frequencies from 0 Hz up to the bandwidth. Those low frequencies show up as very large numbers because integration in the frequency domain is division by jω (so going from acceleration to displacement scales each line by 1/(jω)²), and as the frequency approaches 0 Hz that factor tends to infinity. There are two workarounds I can suggest:
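As a rough numerical illustration of that frequency-domain behaviour (using a hypothetical flat, unit-amplitude acceleration spectrum), double integration divides each spectral line by (2πf)², so the lowest-frequency lines completely dominate the displacement result:

```python
import numpy as np

# Frequency-domain double integration: D(f) = A(f) / (j*2*pi*f)^2
# Magnitudes only, for a flat unit-acceleration spectrum (hypothetical values):
freqs = np.array([0.1, 1.0, 10.0, 100.0])   # Hz
accel_amp = np.ones_like(freqs)             # unit acceleration at each line
disp_amp = accel_amp / (2 * np.pi * freqs) ** 2

for f, d in zip(freqs, disp_amp):
    print(f"{f:6.1f} Hz -> displacement amplitude {d:.3e}")

# Ratio between the 0.1 Hz and 100 Hz lines: (100 / 0.1)^2 = a factor of 10^6
ratio = disp_amp[0] / disp_amp[-1]
```

The same acceleration level at 0.1 Hz produces a displacement a million times larger than at 100 Hz, which is why an Overall Level computed from 0 Hz gives enormous displacement values unless the low-frequency lines are excluded.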
Excellent! Thank you for the quick response, and detailed explanations.
I will run my data through using both methods (Time Data Calculator with extra signal conditioning, and Overall Level via a Frequency Section in the Time Data Processing), and report back once I have a chance to analyze the data.