Wikipedia:Reference desk/Archives/Mathematics/2020 October 19

Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is a transcluded archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


October 19

Linear estimation of a local minimum from discrete data

I have an array of data — say, six (x,y) values — characterized by a minimum y value in the middle of the set, increasing toward the ends of the x range. (The x,y values in this case come from a volatility smile, which I have characterized as a superposition of two hyperbolic functions, but that isn't important.) Basically, I have discrete points on a smooth curve, and I don't want to know or care what the underlying curve is.

Is there a quick linear method for approximating the value of x that corresponds to the minimum of the smooth curve, without resorting to spline fits, other polynomial fits, or fitting my own data model (which doesn't have a closed form solution)?

For example, I can eyeball it easily enough:

*



    *             *

              *
        *_ *
         ^
       minimum
     likely here

I'm just wondering if there's something I can do, perhaps with a weighted average of the slopes of each segment, that might give me an approximation of the minimum without curve-fitting. Somebody must have figured this out. Is there a technique for this? ~Anachronist (talk) 06:46, 19 October 2020 (UTC)

@Anachronist: Simply calculating the argmin from the data directly should suffice, right? This linear-time procedure will take less time than anything else I could suggest here. Otherwise, your answer will depend on what kind of smooth curve you select; you'd essentially need to find the root of the derivative, and numerical differentiation really only works here with interpolation of some sort.--Jasper Deng (talk) 07:16, 19 October 2020 (UTC)
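A minimal sketch of the linear-time argmin suggested above (the function name and the data are illustrative, not from the thread):

    # A linear scan of the samples, no fitting: return the x of the smallest y.
    def argmin_x(xs, ys):
        return xs[ys.index(min(ys))]

    xs = [0, 1, 2, 3, 4, 5]                  # equally spaced samples
    ys = [5.0, 2.2, 1.1, 1.3, 2.8, 5.5]
    print(argmin_x(xs, ys))                  # -> 2, the sample minimum

As the follow-ups below note, this returns a sample point, not the location of the underlying curve's minimum.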
You can fit a quadratic curve (or, for that matter, any polynomial of a fixed degree) in linear time. If you give the data points weights $w_i = c^{y_i}$ for some $c$ between 0 and 1, the higher values – presumably farther away from the minimum – get a lower weight, which is useful if the shape of the curve is poorly approximated by a parabola; the curve in the example looks more cuspy. (This method of using weights is insensitive to translations of the $y$ values but sensitive to their scaling.) It will work poorly, though, if the curve has relatively flat ends. You can alternatively use a sliding window of a fixed width $2k+1$, with indices taken modulo $2k+1$ to keep track of a data segment of length $2k+1$; the $n$-th data point is stored at position $n \bmod (2k+1)$. After detecting the minimum in the stream of data à la JD, keep copying to the window for $k$ more data points; then stop. The discrete minimum is in the middle of the segment of the retained data points. Then use any method of curve fitting to find the minimum.  --Lambiam 08:27, 19 October 2020 (UTC)
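A sketch of the weighted quadratic fit (the weight form $w_i = c^{y_i}$ follows the post above; the value of c and the data are illustrative):

    import numpy as np

    xs = np.array([0., 1., 2., 3., 4., 5.])
    ys = np.array([5.0, 2.2, 1.1, 1.3, 2.8, 5.5])

    c = 0.5
    w = c ** ys                    # larger y -> smaller weight
    # np.polyfit minimizes sum((w_i * r_i)^2), so pass sqrt(w) in order to
    # weight the squared residuals by w_i.
    a, b, _ = np.polyfit(xs, ys, 2, w=np.sqrt(w))
    print(-b / (2 * a))            # vertex of the fitted parabola

The fit is a fixed-size linear solve per window of data, hence linear time overall.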
@Jasper Deng: If you're suggesting that argmin is simply the minimum-value data point, no, that isn't what I need. In my crude illustration, I deliberately put two points (asterisks) at the same y-level, with different slopes on the left and right, to suggest that the minimum of the curve isn't exactly midway between those two lowest points, nor is it simply one point or the other if their y values differ by a tiny amount. In the illustration, the minimum (the underscore character) is closer to the left low point.
I guess I could restate the problem this way:
Given:
  • an unknown asymmetric curve having a first derivative that increases smoothly from negative to positive in an unknown non-linear way,
  • a sparse sampling of known (x,y) data points on that curve, equally spaced along the x axis,
  • a minimum y value in the range of (x,y) samples available,
what approach would give a quick approximation of the x location (not the y value) of the minimum of the curve?
@Lambiam: The best-fit curve for data points in a volatility smile appears to be a skewed hyperbola. I haven't found an efficient way to fit such a thing without resorting to iterative techniques. You've given me an idea with sliding windows and linear interpolation, though. I need to think about it. ~Anachronist (talk) 18:56, 19 October 2020 (UTC)
This is closely analogous (though the other way up) to estimating the position of the mode when data have been classified, the construction involving a pair of crossed straight lines at the top of the modal class bar of the drawn histogram. A simple estimate, which you would have to judge the adequacy of, comes from projecting inwards the straight lines given by joining the data points closest to the minimum to the respective next ones moving outwards. → 2A00:23C6:AA08:E500:40DF:4AE9:718D:E2EA (talk) 13:14, 20 October 2020 (UTC)
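In symbols (the point labels and hat notation are supplied here for concreteness, not part of the post): if the two samples flanking the minimum are $(x_1, y_1)$ and $(x_2, y_2)$, joined to their outward neighbours $(x_0, y_0)$ and $(x_3, y_3)$, the two inward lines have slopes

    \[
      s_L = \frac{y_1 - y_0}{x_1 - x_0}, \qquad
      s_R = \frac{y_3 - y_2}{x_3 - x_2},
    \]

and they cross at

    \[
      \hat{x} = \frac{(y_2 - s_R x_2) - (y_1 - s_L x_1)}{s_L - s_R}.
    \]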
If the derivative passes smoothly through zero, the curve can locally, around the minimum, be approximated by a parabola. If the spacing of the sample values of $x$ is so narrow that the segment involving the four points nearest to the sample minimum is very close to a parabola, then the $x$-value of the intersection of these lines is very close to the $x$-value halfway between the two lowest samples. So, if the curve is locally an exact parabola of the form $y = a(x-m)^2 + c$ and the four data points have $x$-values $x_0 < x_1 < x_2 < x_3$, the value found is $\tfrac12(x_1+x_2)$, independent of $m$, so it is not a continuous function of $m$. Obviously, the method is perfectly suited for curves that are locally close to $y = a\,|x-m| + c$.  --Lambiam 22:18, 20 October 2020 (UTC)
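The algebra behind the parabola claim, in the notation introduced above (our verification, not part of the original post): on $y = a(x-m)^2 + c$ sampled with equal spacing $d$,

    \[
      s_L = a\,(x_0 + x_1 - 2m), \qquad
      s_R = a\,(x_2 + x_3 - 2m), \qquad
      s_L - s_R = -4ad,
    \]
    \[
      \hat{x} = \frac{(y_2 - s_R x_2) - (y_1 - s_L x_1)}{s_L - s_R}
              = \frac{-2ad\,(x_1 + x_2)}{-4ad} = \frac{x_1 + x_2}{2}.
    \]

Every occurrence of $m$ cancels, so the estimate cannot respond to where between $x_1$ and $x_2$ the true minimum actually sits.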
A general equation for a hyperbola is given by $(y - a - bx)(y - c - dx) = e$. The asymptotes are the pair of lines defined by $y = a + bx$ and $y = c + dx$. It has five degrees of freedom, which means a given hyperbola is uniquely defined by any five points lying on that hyperbola. If the curve is indeed hyperbolic, the five parameters can be determined by the five data points around the sample minimum. There are two solutions, which differ only by swapping the two linear factors; take the one with $b < d$. Sanity check: you should now have $b < 0$ and $d > 0$. Warning: there is a risk of instability if the data points do not lie neatly on a hyperbolic curve. It is safer to use a larger window (say seven data points) and find a best least-squares fit.  --Lambiam 10:04, 21 October 2020 (UTC)
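A sketch in code of this five-point fit (function names are ours; it assumes the sampled branch really is a concave-up hyperbola with $b < 0 < d$ and $e > 0$):

    import numpy as np

    # Fit (y - a - b x)(y - c - d x) = e through five points and locate the
    # minimum of the concave-up branch.  Expanding the product gives the conic
    #   y^2 + B xy + A x^2 + D x + E y + F = 0  with
    #   A = b d,  B = -(b + d),  D = a d + b c,  E = -(a + c),  F = a c - e,
    # which is linear in (A, B, D, E, F), so five points determine it exactly.
    def fit_hyperbola(xs, ys):
        M = np.column_stack([xs**2, xs*ys, xs, ys, np.ones_like(xs)])
        A, B, D, E, F = np.linalg.solve(M, -ys**2)
        b, d = np.sort(np.roots([1.0, B, A]))     # asymptote slopes, b < d
        a = (D + b*E) / (d - b)
        c = -E - a
        return a, b, c, d, a*c - F                # e = a c - F

    def branch_min_x(a, b, c, d, e):
        # Upper branch: y = (u + v)/2 + sqrt(((u - v)/2)**2 + e) with
        # u = a + b x, v = c + d x; dy/dx = 0 gives two candidate x values.
        q = np.sqrt(-e * (b + d)**2 / (4 * b * d))
        cands = [(2*s*q - (a - c)) / (b - d) for s in (1, -1)]
        y = lambda x: (a+c+(b+d)*x)/2 + np.sqrt(((a-c+(b-d)*x)/2)**2 + e)
        return min(cands, key=y)

    # Example: five samples of (y + 2x)(y - x) = 8, a skewed smile.
    xs = np.array([-2., -1., 0., 1., 2.])
    ys = -xs/2 + np.sqrt(2.25*xs**2 + 8)
    print(branch_min_x(*fit_hyperbola(xs, ys)))   # -> about 0.6667

With seven or more points, replacing np.linalg.solve by np.linalg.lstsq gives the safer least-squares variant suggested above.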
Note that for five random points the conic that fits them may be an ellipse, or, if it is a hyperbola, the points might be on different branches, so some care will be needed to ensure this doesn't happen. Assuming the slope is really increasing, in other words that the curve is concave up, the second issue should never occur. You can tell if the first one occurs by checking the discriminant of the quadratic terms. (If the conic has the form $Ax^2 + Bxy + Cy^2 + Dx + Ey + F = 0$ then the discriminant is $B^2 - 4AC$.) The conic will have two horizontal tangents; if it's a hyperbola then you want the upper one (assuming the slopes of the asymptotes have opposite signs), and if it's an ellipse then you want the lower one. If there's any uncertainty in your data then you might consider sticking with a parabolic model to avoid overfitting. The algebra involved with the parabolic model is much simpler as well, so if you don't need much precision it may be more suited. --RDBury (talk) 12:30, 21 October 2020 (UTC)
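The discriminant test and the choice of horizontal tangent, as a sketch (names ours; it assumes $A \neq 0$ and real tangent points):

    import numpy as np

    # Horizontal tangents of A x^2 + B xy + C y^2 + D x + E y + F = 0 occur
    # where the x-partial vanishes: 2 A x + B y + D = 0.  Substituting
    # x = -(B y + D)/(2 A) into the conic leaves a quadratic in y.
    def min_candidate(A, B, C, D, E, F):
        disc = B*B - 4*A*C                       # > 0: hyperbola, < 0: ellipse
        roots = np.roots([C - B*B/(4*A), E - B*D/(2*A), F - D*D/(4*A)])
        pts = [(-(B*y + D)/(2*A), y) for y in roots]
        # Hyperbola: take the upper tangent point; ellipse: the lower one.
        pick = max if disc > 0 else min
        return pick(pts, key=lambda p: p[1])

    # The hyperbola (y + 2x)(y - x) = 8, i.e. -2x^2 + xy + y^2 - 8 = 0:
    print(min_candidate(-2, 1, 1, 0, 0, -8))     # -> about (0.667, 2.667)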

The problem here is that the hyperbola is skewed, with a steeper curve on one side of the minimum than the other.

Anyway, I think I figured out a linear approximation without resorting to curve-fitting, which seems to work under the assumption of a smoothly increasing first derivative that starts out negative and transitions to positive. The anonymous contribution from User:2A00:23C6:AA08:E500:40DF:4AE9:718D:E2EA had the right idea. To find the location of the minimum of my curve, given only discrete points on it, and without trying to fit an arbitrary curve to it, I do this:

  1. Find the two lowest points.
  2. For the point on the left, draw a line through that point and the point on its left.
  3. For the point on the right, draw a line through that point and the point on its right.
  4. The x-value where these two lines intersect approximates the location of the minimum (but not its y-value). Done.

This seems to handle skewed hyperbolas well. If the slope on the left is steeper than the slope on the right, the location of the minimum is nearer to the left point of the two lowest points. ~Anachronist (talk) 05:50, 23 October 2020 (UTC)
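A sketch of the four steps above (the function name is ours; it assumes the two lowest samples are adjacent, away from the ends, with at least one more point on each side):

    # Crossed-lines estimate of the x location of the curve's minimum.
    def crossed_lines_min_x(xs, ys):
        i = min(range(1, len(ys) - 1), key=ys.__getitem__)   # lowest interior sample
        j = i + 1 if ys[i + 1] < ys[i - 1] else i - 1        # its lower neighbour
        l, r = min(i, j), max(i, j)                          # the two lowest points
        assert 1 <= l and r <= len(xs) - 2, "need one more point on each side"
        s_left = (ys[l] - ys[l - 1]) / (xs[l] - xs[l - 1])   # line through left pair
        s_right = (ys[r + 1] - ys[r]) / (xs[r + 1] - xs[r])  # line through right pair
        # Intersection of y = ys[l] + s_left (x - xs[l]) and
        #                 y = ys[r] + s_right (x - xs[r]):
        return ((ys[r] - s_right * xs[r]) - (ys[l] - s_left * xs[l])) / (s_left - s_right)

    print(crossed_lines_min_x([0, 1, 2, 3, 4, 5], [5.0, 2.2, 1.1, 1.3, 2.8, 5.5]))  # -> 2.5

For these illustrative numbers the estimate lands exactly halfway between the two lowest samples, which anticipates the objection below.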

@Anachronist: That doesn't sound like a good way to take into consideration data other than the middle four points. This doesn't seem like a robust statistic to me, in the same way that the min and max of a data set are both not robust statistics (sampling does a poor job of estimating them). Note that there is one and only one hyperbola with this property for each given set of points. --Jasper Deng (talk) 07:35, 23 October 2020 (UTC)
(a) The equation I gave above (at 10:04, 21 October 2020 UTC) describes skewed hyperbolas as well. The left and right asymptote slopes are given by $b$ and $d$.
(b) I expect that you will find that the intersection method generally gives a result that is suspiciously close to the $x$-value halfway between those of the lowest two points. If the four points involved happen to lie precisely on a parabola, you will get the exact halfway value. For example, take $(0, 1.2),\ (1, 0),\ (2, 0.8),\ (3, 3.6)$. These four points lie almost on the parabola with equation $y = (x - 1.1)^2$. The lowest two points are at $x = 1$ and $x = 2$. The intersection procedure results in $x = 1.5$. Yet, clearly, the minimum should be just above $x = 1$.  --Lambiam 08:51, 23 October 2020 (UTC)
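A quick check of this example, reusing the crossed_lines_min_x sketch from earlier (the four points are the reconstructed ones above):

    print(crossed_lines_min_x([0, 1, 2, 3], [1.2, 0.0, 0.8, 3.6]))   # -> 1.5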
To drive these points home, take the hyperbola defined by $(y + 2x)(y - x) = 8$. The concave-up branch is then $y = -\tfrac{x}{2} + \sqrt{\tfrac{9x^2}{4} + 8}$. It attains its minimum at $x = \tfrac23$, yet the intersection method results in $x \approx 0.42$, closer to the halfway point between the lowest two points ($x = 0$ and $x = 1$) when sampling at integral $x$-values.  --Lambiam 11:29, 23 October 2020 (UTC)
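A self-contained check of the hyperbola example (our code; the coefficients are those of the example above):

    import math

    # Sample the concave-up branch of (y + 2x)(y - x) = 8 at integer x and
    # apply the crossed-lines construction; the two lowest samples are at
    # x = 0 and x = 1, so their halfway point is 0.5.
    y = lambda x: -x / 2 + math.sqrt(2.25 * x * x + 8)
    s_left = y(0) - y(-1)          # slope of the line into the left low point
    s_right = y(2) - y(1)          # slope of the line into the right low point
    x_hat = ((y(1) - s_right) - y(0)) / (s_left - s_right)
    print(x_hat)                   # about 0.424, near 0.5
    print(2 / 3)                   # the true minimum location, about 0.667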