EGR 103/Fall 2021/Lab 9

From PrattWiki
Revision as of 13:02, 2 November 2021 by DukeEgr93 (talk | contribs) (Created page with "The following document is meant as an outline of what is covered in this assignment. == Preparing For Lab == Note: Going forward, I will refer to the Sakai / Resources Folder...")

The following document is meant as an outline of what is covered in this assignment.

Preparing For Lab

Note: Going forward, I will refer to the Sakai / Resources Folder / ChapraPythonified / Chapter 14 folder as "CP14"

  • Read the "Part Four" introduction in Chapra, pp. 343-345. In section 4.2, note that we have talked / will talk about Chapters 14-18 but we will skip Chapter 16.
  • Read the introduction to Chapter 14.
  • Read 14.1
  • The Python version of the code in 14.1.3 is in CP14 and named Chapra_14_1_3.py. You will need the Table14p03.txt data file for it to work. Notice in particular that Numpy's variance and standard deviation functions require a keyword argument (ddof=1) to match MATLAB's default behavior.
  • We have covered the Python versions of 14.2. Examples 14.2 and 14.3 are in CP14 and named Chapra_ex_14_2.py and Chapra_ex_14_3.py.
  • Skim 14.3. We did the mathematical proof in class. See General_Linear_Regression for specific and general versions but do not get too hung up on the math.
  • Skip or skim 14.4. Linearization is an important concept but outside of the scope of this lab. Figure 14.13 would be a good one to look at though as it shows how plotting transformed variables in different ways might lead to an understanding of the underlying mathematical relationship.
  • Skip 14.5.1. The linregr function is a way-too-specific version of what we will eventually use.
  • Read 14.5.2. Note that the differences between Python and MATLAB here are that we need to set up arrays differently (MATLAB can just use bracketed numbers separated by spaces; Python needs np.array([LIST])) and that certain commands need the np. prefix.
  • Skip or skim the case study.
  • Go to Python:Fitting and specifically:
    • Read the intro and note that to do the demos you will need a Cantilever.dat file with the contents shown
    • Take a look at the common command references
    • Look through the common code; for this lab there will be a special version of it called lab9_common.py which just skips the get_beam_data() function.
      • Note that the make_plot() command works great for generic plots but for the plots in the lab you will generally need to write your own specific code.
    • Take a look at the Polynomial Fitting code and make sure you completely understand each part of it. The parts with a white background are going to be common to all the demonstrations while the code with a yellow background will be what changes each time.
    • Take a look at how to make changes to Polynomial models (Python:Fitting#Polynomial) and General Linear models (Python:Fitting#General_Linear).
  • You can now do Chapra 14.5 completely.
  • Skim 15.1. Although you still have problems to do in Chapter 14, the way you are going to do general linear regression is slightly different from what is presented in Chapter 14.
  • Read 15.2. The key is that to do fitting with multiple independent variables, those variables need to be column vectors. To do graphs, those variables need to be 2D arrays. See the 2D example in Python:Fitting.
  • Skim 15.3. We did this in class; once again, see General_Linear_Regression.
  • I am working on Pythonifying Chapter 15 - stay tuned!
  • Take a look at the General Linear Regression code and make sure you understand what each line does. The np.block() command is the key element in producing the $$\Phi$$ matrix needed by np.linalg.lstsq(), the least squares fit method of the linear algebra package in numpy.
  • You can now do Chapra 14.7, 14.27, 15.10, 15.10 Alternate, and 15.5-15.7 completely.
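As a concrete illustration of the ddof keyword mentioned above (the data values here are made up): NumPy's np.var() and np.std() default to the population formulas, which divide by $$N$$, while MATLAB's var() and std() default to the sample formulas, which divide by $$N-1$$.

```python
import numpy as np

data = np.array([6.395, 6.435, 6.485, 6.495])  # made-up sample values

# ddof=1 switches NumPy to the sample formulas (divide by N-1),
# which is what MATLAB's var() and std() do by default
sample_var = np.var(data, ddof=1)
sample_std = np.std(data, ddof=1)
```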
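A minimal sketch of the array-setup difference and the straight-line fit from 14.5.2, using made-up data (np.polyfit() stands in here for whatever fitting routine the demo code uses):

```python
import numpy as np

# MATLAB: x = [10 20 30 40]   -- Python needs np.array([LIST])
x = np.array([10, 20, 30, 40])
# MATLAB: y = sqrt(x)         -- Python needs the np. prefix
y = np.sqrt(x)

# First-order (straight line) fit; returns [slope, intercept]
p = np.polyfit(x, y, 1)
```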
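The np.block() / np.linalg.lstsq() pieces described above can be sketched as follows, using made-up noiseless data and a two-term model $$y = a_0 + a_1 x$$ (the real demos use the basis functions each problem calls for):

```python
import numpy as np

# Made-up data; x must be a COLUMN vector so np.block stacks columns
x = np.array([[0.0], [1.0], [2.0], [3.0], [4.0]])
y = 1.0 + 2.0 * x  # noiseless linear data for illustration

# Build the Phi matrix: one column per basis function
phi = np.block([x**0, x**1])

# Least squares fit: solve phi @ a ~= y for the coefficients a
a, res, rank, sv = np.linalg.lstsq(phi, y, rcond=None)
```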

Submitting Work

There are Connect and Lab Assignment parts for almost every problem.

    • Be sure to read each problem carefully - sometimes Connect will change a number or a prompt slightly from the book problem. Though you may work in small groups on the Connect problems, you will need to be sure to answer *your specific* problem.
    • Once you get an assignment 100% correct, you will be able to look at the assignments and the explanations for the answers. Note: if there is coding involved in an answer, the solution will be presented as MATLAB code.
    • Use fig.set_size_inches(6, 4, forward=True) to make your graphs all the same size.
    • Be sure to use tight layout and save as a .png (graphics) file, not a .eps file.
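The sizing and saving steps above can be sketched as follows (the plotted data and filename are made up for illustration):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the script can run headless
import matplotlib.pyplot as plt
import numpy as np

x = np.linspace(0, 10, 50)  # made-up data for illustration

fig, ax = plt.subplots(num=1, clear=True)
ax.plot(x, np.sqrt(x), "k-")
ax.set(xlabel="x", ylabel="sqrt(x)")

fig.set_size_inches(6, 4, forward=True)  # same size for every graph
fig.tight_layout()
fig.savefig("plot.png")  # .png, not .eps
```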


Typographical Errors

None yet!

Specific Problems

  • Be sure to include the appropriate version of the honor code -- if you use the examples from Pundit, the original author is either DukeEgr93 or Michael R. Gustafson II depending on how you want to cite things.

Chapra 14.5

Chapra 14.7

  • See Python:Fitting#General_Linear_Regression
  • Be sure to calculate the $$R$$ value. Note that it is not the same as the slope of the line you would get if you try to model $$p$$ as a function of $$T$$.
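One sketch of the $$R$$ calculation using the coefficient-of-determination approach from Chapra (the y and yhat values here are invented; in the actual problem yhat comes from your fitted model):

```python
import numpy as np

# Invented measured values and model predictions for illustration
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
yhat = np.array([2.0, 4.0, 6.0, 8.0, 10.0])

st = np.sum((y - np.mean(y))**2)  # total sum of squares
sr = np.sum((y - yhat)**2)        # sum of squares of residuals
r = np.sqrt((st - sr) / st)       # correlation coefficient R
```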

Chapra 14.27

Chapra 15.10

Chapra 15.10 Alternate

Chapra 15.5

  • Be sure to just use the first column of data where $$c=0$$.

Chapra 15.6

  • The labels and ticks and such are given in the problem statement.
  • Note that you will be both doing statistics with the estimates and making a graph based on a calculation with them. The former needs a column and the latter needs a matrix.
  • Don't forget to calculate the estimate and the error for it!
  • You can copy and paste the coefficient values - just truncate them after four significant digits. They can be in scientific notation or floating-point.
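The column-versus-matrix distinction above can be sketched generically (the coefficients, model form, and data ranges here are all invented; use your own fit results):

```python
import numpy as np

# Invented coefficients from a hypothetical two-variable linear fit
a = np.array([1.0, 0.5, -0.25])

# For statistics, evaluate the model at the measured points (column vectors)
x1 = np.array([[1.0], [2.0], [3.0]])
x2 = np.array([[4.0], [5.0], [6.0]])
yhat_col = a[0] + a[1] * x1 + a[2] * x2   # column of estimates

# For a surface plot, evaluate on 2D meshgrid arrays instead
X1, X2 = np.meshgrid(np.linspace(1, 3, 30), np.linspace(4, 6, 30))
Yhat_grid = a[0] + a[1] * X1 + a[2] * X2  # 2D array, suitable for plot_surface
```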

Chapra 15.7

  • The labels and ticks and such are the same as 15.6 so re-use that code!
  • There are very minor modifications between this and the previous script.
  • Don't forget to calculate the estimate and the error for it!
  • You can copy and paste the coefficient values - just truncate them after four significant digits. They can be in scientific notation or floating-point.

General Concepts

General Linear Regression