Is there a well-known, proven algorithm for finding the limit of a set of points that are time-based metrics? I'm looking for an existing implementation so I don't reinvent something that already exists.

This is what I have in mind:

Run over the numeric values of a series:

1. Calculate L, the top limit, and E, where 2*E is the width of the convergence stripe around L.

- E is configurable

- Most points (a percentage P), except for anomalies/noise, are below the top line L+E.

- P is configurable, typically in the 90-99% range (e.g. 90%, 95%, or 99%).

- A linear regression fit over the last X points has a small slope within +/-S.

- X and S are configurable
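To make the criteria above concrete, here is a minimal Python sketch of one possible check. It assumes (this is my own choice, not stated above) that L is derived from the P-th percentile of the last X points, so that a fraction P of the window lies below L+E, and that the slope test is an ordinary least-squares fit over the same window. The function name `estimate_limit` and the return shape are hypothetical.

```python
import statistics

def estimate_limit(values, E, P, X, S):
    """Estimate a top limit L for a converging series.

    Assumptions (not from the question itself):
    - L is chosen so that a fraction P of the last X points falls
      below L + E, i.e. L = (P-th percentile of the window) - E.
    - The slope test is a least-squares fit over the last X points.
    """
    window = values[-X:]

    # P-th percentile of the window (quantiles with n=100 yields 99
    # percentile cut points; pick the one closest to P).
    idx = max(0, min(98, round(P * 100) - 1))
    pth = statistics.quantiles(window, n=100)[idx]
    L = pth - E

    # Least-squares slope over the window, with x = 0..len(window)-1.
    n = len(window)
    mean_x = (n - 1) / 2
    mean_y = statistics.fmean(window)
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(window))
    den = sum((x - mean_x) ** 2 for x in range(n))
    slope = num / den if den else 0.0

    converged = abs(slope) <= S
    return L, slope, converged
```

For a series that approaches an asymptote (e.g. `100 - 50/i`), this returns an L just under the asymptote and flags convergence once the tail slope falls within +/-S.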

The algorithm keeps stored state: the previous maximum limit, max-L.

- If the L calculated in step 1 is lower than max-L, it is considered a local maximum and discarded.

- If L is greater than max-L, it is considered a new maximum limit and stored as the new max-L.
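The max-L bookkeeping by itself is simple to sketch. This is a hypothetical in-memory holder (the class name `LimitTracker` is my own; in practice the state could live in a database or metrics store), independent of however L itself is computed:

```python
class LimitTracker:
    """Stores the running maximum limit (max-L) across runs."""

    def __init__(self):
        self.max_limit = None  # stored max-L; None until the first L is accepted

    def update(self, L):
        """Apply a newly calculated L to the stored state.

        Returns True if L becomes the new max-L, False if it is
        a local maximum and is discarded.
        """
        if self.max_limit is None or L > self.max_limit:
            self.max_limit = L
            return True
        return False
```

With this, a lower L leaves the stored max-L untouched, and only a higher L replaces it.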