Bytes | Developer Community
algorithm for detecting a limit of time-series numbers

Is there a well-known, proven algorithm for finding the limit of a set of points that are time-based metrics? I'm looking for an existing implementation so that I don't try to invent something that already exists.

This is what I have in mind:

Run over the numeric values of the series:
1. Calculate L, the top limit, and E, where 2*E is the width of the convergence stripe around L.
- E is configurable.
- Most points (a percentage P, excluding anomalies/noise) lie below the top line L+E.
- P is configurable, typically in the 90-99% range.
- A linear regression over the last X points has a small slope of at most +/-S.
- X and S are configurable.

2. The algorithm uses a stored state holding the previous maximum limit, max-L.
- If the L calculated in step 1 is lower than max-L, it is considered a local maximum and discarded.
- If L is greater than max-L, it is considered a new maximum limit and stored as the new max-L.
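The steps above can be sketched roughly as follows. This is only a minimal illustration of the idea, not a known library algorithm: the function name, the percentile-based choice of L, and all default parameter values are my own assumptions.

```python
def detect_limit(points, e=1.0, p=0.95, x=10, s=0.05, prev_max_l=None):
    """Sketch of the described scheme: estimate a top limit L for a series.

    points     : numeric samples, oldest first
    e          : half-width of the 2*E convergence stripe around L
    p          : fraction of points expected to lie below L + E
    x          : number of trailing points used for the slope check
    s          : maximum absolute regression slope considered "flat"
    prev_max_l : stored max-L from earlier runs (None on the first run)
    Returns (l, new_max_l, converged).
    """
    # Step 1: choose L so that roughly P percent of points fall below
    # L + E -- here via the P-th percentile shifted down by E (one
    # possible interpretation, assumed for illustration).
    ordered = sorted(points)
    idx = min(len(ordered) - 1, int(p * len(ordered)))
    l = ordered[idx] - e

    # Slope check: least-squares linear regression over the last X points.
    tail = points[-x:]
    n = len(tail)
    mean_t = (n - 1) / 2
    mean_y = sum(tail) / n
    num = sum((t - mean_t) * (y - mean_y) for t, y in enumerate(tail))
    den = sum((t - mean_t) ** 2 for t in range(n))
    slope = num / den if den else 0.0
    converged = abs(slope) <= s

    # Step 2: stored-state rule -- keep only the global maximum limit.
    if prev_max_l is not None and l <= prev_max_l:
        new_max_l = prev_max_l   # local maximum, discarded
    else:
        new_max_l = l            # new global max-L
    return l, new_max_l, converged
```

For example, on a flat series the slope is zero, so the convergence test passes, and a later run whose L falls below the stored max-L leaves max-L unchanged.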

Apr 7 '19 #1