COMPUTER PROBLEM-SOLVING IN ENGINEERING AND COMPUTER SCIENCE
EGR 1400 WINTER 2023
PROBLEM-SOLVING EXERCISE #3: ESTIMATING AND PREDICTING UNKNOWNS

In this exercise you will use least-squares curve fitting to develop two equations to model the data given. Using these equations, we will predict the function values for two inputs and evaluate the prediction made by each of the curves and by linear interpolation.

x:    2   7  11  17  20  26  31  36  41  46
f(x): 8  16  20  25  28  29  32  34  36  38

(The shaded values in the original table, reserved as test data, are the pairs (17, 25) and (36, 34), as recovered from the solution code below.)

PART A
Using the 8 non-shaded values above, find a0 and a1 for the least-squares linear regression. We will save the shaded values for our test data, that is, data points that are known but not included in the information used to make a representative curve. We will use these points to see how close our curve fit comes to predicting actual values that were not used to derive the curve. Compute the overall squared error. Write the completed polynomial.

PART B
Using the 8 non-shaded values from Part A, find a0, a1, and a2 for a parabolic least-squares regression (polynomial of degree 2). Use MS Excel to solve for these coefficients. Compute the overall squared error. Write the completed polynomial. Include a printout of your Excel spreadsheet.

PART C
On two separate graphs, plot the non-shaded data points and show the resulting curves from Part A and Part B; a separate graph for each curve. Use graph paper.
Parts A and B.
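As a check on the script's output, the Part A coefficients can also be computed by hand from the normal equations for a degree-1 least-squares fit; a minimal standalone sketch (assuming the 8 non-shaded points from the table) is:

import numpy as np

# The 8 non-shaded (training) points
x = np.array([2, 7, 11, 20, 26, 31, 41, 46], dtype=float)
y = np.array([8, 16, 20, 28, 29, 32, 36, 38], dtype=float)
n = len(x)

# Normal equations for f(x) = a0 + a1*x:
#   a1 = (n*sum(x*y) - sum(x)*sum(y)) / (n*sum(x^2) - sum(x)^2)
#   a0 = (sum(y) - a1*sum(x)) / n
a1 = (n*np.sum(x*y) - np.sum(x)*np.sum(y)) / (n*np.sum(x**2) - np.sum(x)**2)
a0 = (np.sum(y) - a1*np.sum(x)) / n
print(f"f(x) = {a0:.5f} + {a1:.5f}x")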
The full Python script for the exercise is given below with explanatory embedded comments; it fits the linear model of Part A and the quadratic model of Part B, plus a cubic fit as an extra comparison:
import numpy as np
from scipy.optimize import curve_fit
from scipy.interpolate import interp1d
# Define the model functions to fit
def linear_func(x, a, b):
    return a*x + b

def quadratic_func(x, a, b, c):
    return a*x**2 + b*x + c

# A cubic model is also defined as an extra comparison beyond Parts A and B
def cubic_func(x, a, b, c, d):
    return a*x**3 + b*x**2 + c*x + d
# Training data: the 8 non-shaded points; test data: the 2 shaded (held-out) points
X_train = np.array([2, 7, 11, 20, 26, 31, 41, 46])
Y_train = np.array([8, 16, 20, 28, 29, 32, 36, 38])
X_test = np.array([17, 36])
Y_test = np.array([25, 34])
# Fit the linear, quadratic and cubic functions to the data
fit_linear, _ = curve_fit(linear_func, X_train, Y_train)
print('Linear function parameters:', fit_linear)
fit_quadratic, _ = curve_fit(quadratic_func, X_train, Y_train)
print('Quadratic function parameters:', fit_quadratic)
fit_cubic, _ = curve_fit(cubic_func, X_train, Y_train)
print('Cubic function parameters:', fit_cubic)
# Evaluate the fitted functions on the test data
Y_pred_linear = linear_func(X_test, *fit_linear)
Y_pred_quadratic = quadratic_func(X_test, *fit_quadratic)
Y_pred_cubic = cubic_func(X_test, *fit_cubic)
# Calculate the squared error of each fit on the held-out test points
se_linear = np.sum((Y_pred_linear - Y_test)**2)
se_quadratic = np.sum((Y_pred_quadratic - Y_test)**2)
se_cubic = np.sum((Y_pred_cubic - Y_test)**2)
# Print the test squared errors
print("\n")
print(f"Squared-error for linear function: {se_linear}")
print(f"Squared-error for quadratic function: {se_quadratic}")
print(f"Squared-error for cubic function: {se_cubic}")
# Print the completed polynomials
print("\n")
print(f"Linear function: {fit_linear[0]:.5f}x + {fit_linear[1]:.5f}")
print(f"Quadratic function: {fit_quadratic[0]:.5f}x^2 + {fit_quadratic[1]:.5f}x + {fit_quadratic[2]:.5f}")
print(f"Cubic function: {fit_cubic[0]:.5f}x^3 + {fit_cubic[1]:.5f}x^2 + {fit_cubic[2]:.5f}x + {fit_cubic[3]:.5f}")
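Part C asks for two separate graphs of the non-shaded data points with the Part A and Part B curves (drawn on graph paper in the original assignment). A hedged matplotlib sketch that produces the two plots, reusing the fitted coefficients from the script above, might look like:

import matplotlib.pyplot as plt

xs = np.linspace(X_train.min(), X_train.max(), 200)  # dense grid for smooth curves

# Graph 1: non-shaded points with the Part A (linear) fit
plt.figure()
plt.scatter(X_train, Y_train, label="non-shaded data")
plt.plot(xs, linear_func(xs, *fit_linear), label="linear fit (Part A)")
plt.xlabel("x")
plt.ylabel("f(x)")
plt.legend()
plt.title("Part A: linear least-squares fit")

# Graph 2: non-shaded points with the Part B (parabolic) fit
plt.figure()
plt.scatter(X_train, Y_train, label="non-shaded data")
plt.plot(xs, quadratic_func(xs, *fit_quadratic), label="quadratic fit (Part B)")
plt.xlabel("x")
plt.ylabel("f(x)")
plt.legend()
plt.title("Part B: parabolic least-squares fit")

plt.show()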