The formula y[x_] = y0 + x^n A E^(-(x - x0)^2/b^2) is a modified Maxwellian used to fit the time-of-flight (velocity/intensity) profile of a pulsed laser deposition plume at a fixed distance from the origin; in this case, 4 cm. I would like to determine the standard error of the parameter n at single observations. I need this because the exponent is traditionally fixed to a small integer, i.e. 1, -1, 2, -2, but in my fits there are instances where n reaches -17, and it varies at each distance. I would therefore like to see which regions of my TOF data (if any) are producing the largest errors in my fitted n value. Is there any way to produce the results I am looking for in Mathematica?
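For concreteness, here is a minimal, self-contained sketch of the kind of fit I am running. The data below are synthetic placeholders (not my actual TOF trace), and the starting values are only illustrative:

```
model = y0 + A x^n Exp[-(x - x0)^2/b^2];

(* synthetic stand-in data, peaked near x = 4 *)
data = Table[{x, 5 + x^2 Exp[-(x - 4)^2/1.5^2] + RandomReal[{-0.05, 0.05}]},
   {x, 0.5, 8, 0.1}];

fit = NonlinearModelFit[data, model,
   {{y0, 5}, {A, 1}, {n, 2}, {x0, 4}, {b, 1.5}}, x];

fit["ParameterTable"] (* one overall standard error for n, not per observation *)
```

As the comment notes, "ParameterTable" (or "ParameterErrors") only gives me a single global uncertainty for n over the whole fit, which is not what I am after.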
Essentially, I am looking for the equivalent of the NonlinearModelFit property "SinglePredictionErrors", but for a single fitted parameter rather than for the predicted response.
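If no built-in property exists, what I am imagining is something like a leave-one-out (jackknife) refit that records how the fitted n shifts when each observation is removed. A rough sketch, assuming data is the list of {time, intensity} pairs and model and the starting values are the same as in my full fit, might look like:

```
(* assumes: data = {{x1, y1}, ...} and
   model = y0 + A x^n Exp[-(x - x0)^2/b^2] are already defined *)
start = {{y0, 5}, {A, 1}, {n, 2}, {x0, 4}, {b, 1.5}};

nShift = Table[
   {data[[k, 1]],
    n /. NonlinearModelFit[Drop[data, {k}], model, start, x][
       "BestFitParameters"]},
   {k, Length[data]}];

ListPlot[nShift, AxesLabel -> {"TOF", "fitted n with point k removed"}]
```

Would something along these lines be statistically sound here, or is there a cleaner built-in way to attribute the uncertainty in n to individual observations?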
I will attach a notebook with an example segment of data. It is standalone, so nothing needs to be imported. As you will see, I included an output of "SinglePredictionErrors", but that is meaningless to me; I would like to produce a similar plot of the error in n at single observations.
Attachment: FailuresBlank.nb