Bytes IT Community

algorithm for detecting a limit of time-series numbers

Is there a well-known, proven algorithm for finding the limit of a set of points that are time-based metrics? I'm looking for an existing implementation so I don't try to invent something that already exists.

This is what I have in mind:

Run over the numeric values of the series:
1. Calculate L, the top limit, and E, where 2*E is the width of the convergence stripe around L.
- E is configurable.
- Most points, a percentage P of them, fall below the top line L+E, except for anomalies/noise.
- P is configurable, typically in the 90-95-99% range.
- A linear regression line fitted over the last X points has a minor slope of +/-S.
- X and S are configurable.
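Step 1 above could be sketched roughly as follows. This is only an illustration of one way to do it, not a known library routine: L is taken as the P-th percentile of the values (so that P% of points fall below it), and the slope check uses an ordinary least-squares fit over the last X points. The function name `estimate_limit` and the default parameter values are my own choices.

```python
import numpy as np

def estimate_limit(values, p=0.95, x=20, s=0.01):
    """Estimate the top limit L of a time series.

    L is the p-th percentile of the values, so roughly p (e.g. 95%)
    of the points lie at or below it. A least-squares line is fitted
    to the last x points; the series counts as converged ("flat")
    only if the fitted slope is within +/-s.
    Returns (L, is_flat).
    """
    values = np.asarray(values, dtype=float)
    L = np.percentile(values, p * 100)  # most points (p%) fall at or below L

    tail = values[-x:]
    t = np.arange(len(tail))
    slope, _ = np.polyfit(t, tail, 1)  # linear regression over last x points
    is_flat = abs(slope) <= s
    return L, is_flat
```

For a series oscillating around a plateau, this returns an L near the top of the band and reports the tail as flat; during a steep ramp-up, the slope check fails and the candidate L can be ignored.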

The algorithm uses a stored state: the previous maximum limit, max-L.
- If the L calculated in step 1 is lower than max-L, it is considered a local maximum and discarded.
- If L is greater than max-L, it is considered a new maximum limit and stored as the new max-L.
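The max-L bookkeeping described above amounts to a small stateful helper; a minimal sketch (the class name `LimitTracker` is hypothetical, not from any library) could look like this:

```python
class LimitTracker:
    """Keeps the running maximum limit (max-L) across successive windows.

    A newly calculated L below the stored max-L is treated as a local
    maximum and discarded; a larger L replaces max-L.
    """

    def __init__(self):
        self.max_l = None  # no limit seen yet

    def update(self, l):
        """Offer a new candidate limit. Returns True if accepted."""
        if self.max_l is None or l > self.max_l:
            self.max_l = l   # new global maximum limit
            return True      # accepted as the new max-L
        return False         # local maximum, discarded
```

In practice this state would be persisted between runs (e.g. in a file or database) so the stored max-L survives restarts.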

Apr 7 '19 #1