So I've been asked to trigger an alarm condition when we see a sudden deviation in some data. The numbers themselves don't really matter and are constantly changing. Management has asked me to take a running average of the data over a small period of time and compare it to either the most recent single data point or a smaller running average.
I know I can do this by moving the data into a bunch of DINT tags and doing the math myself, but is there an instruction specifically designed for something like this? It seems like a common enough requirement, so I'm hopeful.
*Edit* Whoops, forgot to mention this is an RSLogix5000 project.
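For what it's worth, here's roughly the "do the math myself" approach I had in mind, as a minimal Structured Text sketch. Everything in it is an assumption for illustration: the tag names, the 20-sample long window, the 5-sample short window, and the DeviationLimit threshold are all made up, and it assumes the routine runs in a periodic task so each scan is one sample. I'd probably use REALs rather than DINTs so the averages don't truncate.

```
(* Assumed tags (all hypothetical, created in the tag database):
   RawValue       : REAL      -- incoming data point, sampled once per scan of a periodic task
   Buffer         : REAL[20]  -- circular buffer of the last 20 samples
   BufIndex       : DINT      -- next buffer slot to overwrite
   LongAvg        : REAL      -- running average over all 20 samples
   ShortAvg       : REAL      -- running average over the 5 newest samples
   Deviation      : REAL
   DeviationLimit : REAL      -- tuning value, e.g. set from the HMI
   DeviationAlarm : BOOL
   i, j           : DINT
   Sum, ShortSum  : REAL
*)

// Store the newest sample in the circular buffer
Buffer[BufIndex] := RawValue;
BufIndex := (BufIndex + 1) MOD 20;

// Long running average over the whole buffer
Sum := 0.0;
FOR i := 0 TO 19 DO
    Sum := Sum + Buffer[i];
END_FOR;
LongAvg := Sum / 20.0;

// Short average over the 5 most recent samples, walking backwards from the newest
ShortSum := 0.0;
FOR i := 1 TO 5 DO
    j := (BufIndex - i + 20) MOD 20;
    ShortSum := ShortSum + Buffer[j];
END_FOR;
ShortAvg := ShortSum / 5.0;

// Alarm when the recent behaviour pulls away from the longer-term average
Deviation := ABS(ShortAvg - LongAvg);
DeviationAlarm := Deviation > DeviationLimit;
```

It would obviously be nicer if a built-in instruction handled the buffering and averaging for me, which is what I'm hoping someone can point me to.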