Get the Least-squares fit of a polynomial to data in Python

To get the least-squares fit of a polynomial to data in Python, we use numpy.polynomial.polynomial.polyfit(). This function finds the polynomial coefficients that best fit the given data points using the method of least squares.

Syntax

numpy.polynomial.polynomial.polyfit(x, y, deg, rcond=None, full=False, w=None)

Parameters

  • x − The x-coordinates of the sample points
  • y − The y-coordinates of the sample points
  • deg − Degree of the fitting polynomial
  • rcond − Relative condition number of the fit (default: len(x)*eps)
  • full − If True, returns diagnostic information (default: False)
  • w − Weights for data points (default: None)

Return Value

Returns polynomial coefficients ordered from low to high degree. When full=True, also returns residuals, rank, singular values, and condition number.
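Note that this ascending order is the opposite of the older np.polyfit. A minimal sketch verifying the ordering with data whose true coefficients are known (an exact line y = 2 + 3x):

```python
import numpy as np
from numpy.polynomial import polynomial as P

x = np.array([0., 1., 2., 3.])
y = 2. + 3. * x                # exact line: intercept 2, slope 3

print(P.polyfit(x, y, 1))      # ascending order: [2. 3.]
print(np.polyfit(x, y, 1))     # legacy np.polyfit is descending: [3. 2.]
```

Keeping this ordering difference in mind avoids a common source of confusion when migrating from the legacy np.polyfit API.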

Example

Let's fit a cubic polynomial to noisy data −

import numpy as np
from numpy.polynomial import polynomial as P

# Create x-coordinates
x = np.linspace(-1, 1, 51)
print("X coordinates:")
print(x)

# Create y-coordinates with noise
y = x**3 - x + np.random.randn(len(x))
print("\nY coordinates:")
print(y)

# Fit a 3rd degree polynomial
coefficients, stats = P.polyfit(x, y, 3, full=True)
print("\nPolynomial coefficients (low to high degree):")
print(coefficients)
print("\nDiagnostic statistics:")
print(stats)

Output

Your exact values will differ from run to run because the noise is random −
X coordinates:
[-1.   -0.96 -0.92 -0.88 -0.84 -0.8  -0.76 -0.72 -0.68 -0.64 -0.6  -0.56
 -0.52 -0.48 -0.44 -0.4  -0.36 -0.32 -0.28 -0.24 -0.2  -0.16 -0.12 -0.08
 -0.04  0.    0.04  0.08  0.12  0.16  0.2   0.24  0.28  0.32  0.36  0.4
  0.44  0.48  0.52  0.56  0.6   0.64  0.68  0.72  0.76  0.8   0.84  0.88
  0.92  0.96  1.  ]

Y coordinates:
[ 0.5   -0.2   -1.8   -1.4   -0.1    0.6    0.3   -0.5   -1.2    0.8
  0.2    0.9    0.1    0.1   -1.4    2.0   -0.1    1.7    0.7    0.2
  0.7    0.8   -1.2    1.5    1.3   -2.5   -0.3   -1.2    0.5   -0.5
  0.5   -0.1   -2.7   -0.5    1.5   -2.4   -1.9   -1.4   -1.2   -1.6
 -0.8    1.6   -0.5   -0.9   -0.3   -0.1    0.6    1.1   -2.3    2.0]

Polynomial coefficients (low to high degree):
[-0.17  -1.84   0.09   2.39]

Diagnostic statistics:
[array([60.44]), 4, array([1.38, 1.32, 0.50, 0.29]), 1.13e-14]

Understanding the Results

The coefficients are ordered from the constant term upward, so the fit is the polynomial -0.17 - 1.84x + 0.09x² + 2.39x³. This roughly recovers the underlying curve x³ - x used to generate the data; the random noise accounts for the differences.
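To evaluate the fitted polynomial at new points, pass the coefficients to numpy.polynomial.polynomial.polyval, which expects the same low-to-high ordering. A short sketch using the coefficients printed above:

```python
import numpy as np
from numpy.polynomial import polynomial as P

# Coefficients from the fit above, ordered low to high degree
coef = np.array([-0.17, -1.84, 0.09, 2.39])

# Evaluates -0.17 - 1.84*x + 0.09*x**2 + 2.39*x**3 at x = 0.5
y_fit = P.polyval(0.5, coef)
print(y_fit)  # ≈ -0.76875
```

polyval also accepts an array of x-values, which makes it convenient for plotting the fitted curve over the original data.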

Using Weights

You can apply weights to give more importance to certain data points −

import numpy as np
from numpy.polynomial import polynomial as P

x = np.array([1, 2, 3, 4, 5])
y = np.array([2, 3, 5, 7, 11])
weights = np.array([1, 1, 2, 2, 3])  # Higher weights for later points

# Fit with weights
coefficients_weighted = P.polyfit(x, y, 2, w=weights)
print("Weighted polynomial coefficients:")
print(coefficients_weighted)

# Fit without weights
coefficients_unweighted = P.polyfit(x, y, 2)
print("\nUnweighted polynomial coefficients:")
print(coefficients_unweighted)
Output (values rounded) −

Weighted polynomial coefficients:
[ 2.299 -0.572  0.46 ]

Unweighted polynomial coefficients:
[ 2.    -0.371  0.429]

The two fits differ because the larger weights pull the curve toward the later data points.
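What the w parameter does, per the NumPy documentation, is minimize the sum of the squared weighted residuals, sum((w * (y - p(x)))**2). A sketch reproducing the weighted fit with plain least squares, under that assumption:

```python
import numpy as np
from numpy.polynomial import polynomial as P

x = np.array([1., 2., 3., 4., 5.])
y = np.array([2., 3., 5., 7., 11.])
w = np.array([1., 1., 2., 2., 3.])

# polyfit minimizes sum((w * (y - p(x)))**2), so scaling the
# Vandermonde rows and y by w reduces it to ordinary least squares
V = P.polyvander(x, 2)                  # columns: 1, x, x**2
coef, *_ = np.linalg.lstsq(V * w[:, None], w * y, rcond=None)

print(np.allclose(coef, P.polyfit(x, y, 2, w=w)))  # True
```

This equivalence is also why weights of 2 and 3 count like 4 and 9 duplicate observations: the residuals are weighted before squaring.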

Conclusion

Use numpy.polynomial.polynomial.polyfit() to find the best polynomial fit to your data. The function returns coefficients in ascending degree order and can provide diagnostic information when full=True.
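As a quick check of that diagnostic information, the first entry returned when full=True is the sum of squared residuals of the fit, which you can verify directly. A minimal sketch with deterministic data (the line-plus-wiggle signal below is just illustrative):

```python
import numpy as np
from numpy.polynomial import polynomial as P

x = np.linspace(0., 1., 20)
y = 1. + 2. * x + 0.1 * np.sin(20. * x)      # line plus a deterministic wiggle

coef, (resid, rank, sv, rcond) = P.polyfit(x, y, 1, full=True)
sse = np.sum((y - P.polyval(x, coef)) ** 2)  # direct sum of squared residuals
print(np.allclose(resid, sse))               # True
```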

Updated on: 2026-03-26T19:41:21+05:30
