??? 03/14/12 10:39
#186655 - Not what he (or I) asked | Responding to: ???'s previous message
Yagnesh Mehta said:
Hi Mr. Dabir,
as per my previous post, 10-bit: 360/1024 = 0.35° is sufficient for me. 12-bit: 360/4096 = 0.087°; I don't think I require that much accuracy.

That was not what he asked. If you had done your homework and played around with the formulas, you would know that you do not need 1024 samples per period if the input signal is a pure sine wave. You don't need 512 samples per period either. Perform the calculations and you will see where the error from the sampling rate becomes much smaller than the error from the ADC resolution (see the sketch at the end of this post).

The questions you must ask are:
- What precision do your measurements need? Max 10% error? Max 1% error? Max 0.1% error? Max 0.001% error?
- What is the bandwidth/quality of your signal? Is it a pure 50 Hz sine, or does it contain significant amounts of overtones?

We simply cannot tell you how many bits you need for the ADC, or how many samples per second or per period, because we do not know what the signal looks like. Do you need to capture the signal with full bandwidth up to 400 Hz? 1 kHz? 4 kHz? The amount of distortion affects the sampling rate (and also the ADC resolution) required to stay within 1% error. Or 0.1% error. Or whatever you need.

And errors are cumulative. Errors from your signal conditioning will combine with errors from the ADC sampling rate, with errors from the ADC resolution, and with errors from the voltage reference.

If you buy a multimeter, you must have a view on what you require. Max 1% error for "normal" mains power? Max 1% error even for heavily distorted mains power?

If you had spent a bit of time with Google, you would be able to find information about the requirements true-RMS power meters must meet for different precision levels, depending on the signal and your needs. But formulas and tables are not meaningful if you don't even know what your requirements are.
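To make the sampling-rate point concrete, here is a minimal C sketch of my own (not from the original thread). It computes the RMS of a unit-amplitude sine from N samples spread evenly over exactly one period at an arbitrary starting phase, once with ideal samples and once with samples rounded to a 10-bit step. The coherent one-period window and the idealized round-to-nearest quantizer are simplifying assumptions; a real meter also has a non-synchronous window, harmonics, and reference error on top of this.

```c
/* Sketch: RMS error from sample count vs. 10-bit quantization,
 * for a pure sine sampled coherently over one period.
 * Build with: cc rms.c -lm
 */
#include <stdio.h>
#include <math.h>

#define PI 3.14159265358979323846

/* RMS of one period of a unit sine: n samples, starting phase 'phase',
 * optionally rounded to a 'bits'-wide step (bits = 0 -> ideal samples).
 * The quantizer ignores the half-LSB top-code clipping of a real ADC. */
static double rms_sine(int n, double phase, int bits)
{
    double sum = 0.0;
    for (int k = 0; k < n; k++) {
        double s = sin(2.0 * PI * k / n + phase);
        if (bits > 0) {
            double lsb = 2.0 / (1 << bits);     /* full scale = +/-1 */
            s = lsb * floor(s / lsb + 0.5);     /* round to nearest step */
        }
        sum += s * s;
    }
    return sqrt(sum / n);
}

int main(void)
{
    const double true_rms = 1.0 / sqrt(2.0);    /* exact RMS of unit sine */
    const int counts[] = { 4, 8, 16, 64, 512, 1024 };

    printf("%6s %14s %14s\n", "N", "err (ideal)", "err (10-bit)");
    for (unsigned i = 0; i < sizeof counts / sizeof counts[0]; i++) {
        int n = counts[i];
        double e_ideal = fabs(rms_sine(n, 0.3, 0)  - true_rms) / true_rms;
        double e_quant = fabs(rms_sine(n, 0.3, 10) - true_rms) / true_rms;
        printf("%6d %14.3e %14.3e\n", n, e_ideal, e_quant);
    }
    return 0;
}
```

The ideal-sample column is at floating-point noise level for every N >= 3 (for a pure sine sampled coherently, the sum of sin² over one period is exactly N/2 regardless of phase), so all of the remaining error comes from the 10-bit quantization. That is the point: past a handful of samples per period, sample count stops being the limiting factor for a pure sine; resolution, signal distortion, and reference accuracy are what set your error budget.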