PROBLEM 1(30pts) Let $X = 1$ indicate that a material flaw is present and $Y = 1$ indicate that a newly developed test detects a flaw. Let $P(Y=1 \mid X=1)$ denote the probability that a material flaw is detected by the test, given that it is present. Let $P(Y=0 \mid X=0)$ denote the probability that a material flaw is not detected, given that it is not present. Assume that in the field, the probability that a flaw is present is $P(X=1)$.
(a)(10pts) Find the values of the four joint probabilities $P(X=i, Y=j)$, $i, j \in \{0, 1\}$. [NOTE 1: In doing so, do NOT use the fact that they sum to 1.0 to arrive at any of them; use this fact to validate your answers. NOTE 2: I would suggest that, to compute a given answer, you use the ‘vpa’ command to achieve accuracy to five decimal places. NOTE 3: Show each equation HERE, but perform the computations in your Matlab code.]
Solution: [See code @1(a).]
(i):
(ii):
(iii):
(iv):
(v): Validation:
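A minimal Matlab sketch of the computation in (a), assuming the three given quantities are $P(Y=1 \mid X=1)$, $P(Y=0 \mid X=0)$, and $P(X=1)$; the numerical values below are hypothetical placeholders, not the assignment's values:

% Sketch for 1(a). p1, p2, p are hypothetical placeholder values; substitute
% the values given in the assignment. 'vpa' requires the Symbolic Math Toolbox.
p1 = sym('0.99');     % assumed P(Y=1 | X=1): flaw detected, given present
p2 = sym('0.98');     % assumed P(Y=0 | X=0): no detection, given no flaw
p  = sym('0.001');    % assumed P(X=1): flaw present in the field

P11 = vpa(p1*p, 5)            % P(X=1, Y=1)
P10 = vpa((1-p1)*p, 5)        % P(X=1, Y=0)
P01 = vpa((1-p2)*(1-p), 5)    % P(X=0, Y=1)
P00 = vpa(p2*(1-p), 5)        % P(X=0, Y=0)

check = vpa(P11 + P10 + P01 + P00, 5)   % validation: should equal 1.0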
(b)(5pts) From your results in (a), compute the values of the two conditional probabilities to five decimal places.
Solution: [See code @1(b).]
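A possible continuation for (b), assuming (since the symbols did not survive extraction) that the two conditionals of interest are $P(X=1 \mid Y=1)$ and $P(X=0 \mid Y=0)$, computed from the joint probabilities of the 1(a) sketch:

% Sketch for 1(b); reuses P11, P10, P01, P00 from the 1(a) sketch above.
PX1_given_Y1 = vpa(P11/(P11 + P01), 5)   % assumed target: P(X=1 | Y=1)
PX0_given_Y0 = vpa(P00/(P00 + P10), 5)   % assumed target: P(X=0 | Y=0)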
(c)(10pts): You are given $P(X=1)$, $P(Y=1 \mid X=1)$, and $P(Y=0 \mid X=0)$. Use only this information to run simulations of $(X, Y)$. To speed things up, first simulate all of the x-values, and initialize the y-values to zeros. Then use a ‘for’ loop to simulate each y-value corresponding to the associated x-value via the conditional probabilities $P(Y=1 \mid X=1)$ and $P(Y=1 \mid X=0)$. Once you have the simulations (do not plot them), you can estimate each joint probability via the code given in part (c). This will allow you to validate your results in (a), should you wish. Finally, use these estimates to arrive at estimates of the two conditional probabilities you computed in (b). [Be SURE to use the ‘vpa’ command for these.] Comment on whether you think your code is correct by comparing the estimates to the true values.
Solution: [See code @1(c).]
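A minimal simulation sketch for (c), under the same hypothetical values p, p1, p2 as in the 1(a) sketch; nsim is an assumed simulation count, since the number requested in the original statement did not survive extraction:

% Sketch for 1(c): simulate X first, then Y given X inside a 'for' loop.
nsim = 1e6;                                % assumed number of simulations
x = double(rand(nsim,1) < double(p));      % X = 1 with probability P(X=1)
y = zeros(nsim,1);                         % initialize the y-values
for k = 1:nsim
    if x(k) == 1
        y(k) = double(rand < double(p1));        % draw Y from P(Y=1 | X=1)
    else
        y(k) = double(rand < double(1 - p2));    % draw Y from P(Y=1 | X=0)
    end
end
% Relative-frequency estimates of the joint probabilities:
P11hat = vpa(mean(x==1 & y==1), 5)
P10hat = vpa(mean(x==1 & y==0), 5)
P01hat = vpa(mean(x==0 & y==1), 5)
P00hat = vpa(mean(x==0 & y==0), 5)
% Estimates of the (assumed) conditionals from (b):
PX1_given_Y1_hat = vpa(P11hat/(P11hat + P01hat), 5)
PX0_given_Y0_hat = vpa(P00hat/(P00hat + P10hat), 5)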
(d)(5pts): Suppose that we have the probabilities given above, and that the detection device is incorporated into 20 million vehicles. The detection of a fault entails a cost of $200 to address it, plus $50 to replace a faulty unit. Find the total expected cost of addressing (i) vehicles that actually have the fault, and (ii) vehicles that, in fact, do not have the fault.
Solution: [See code @1(d).]
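A sketch for (d) under a stated cost interpretation, which is an assumption: every detection incurs the $200 addressing cost, and only detections of an actual fault additionally incur the $50 unit-replacement cost. It reuses the joint probabilities from the 1(a) sketch:

% Sketch for 1(d): expected costs over 20 million vehicles.
N = 20e6;                                     % number of vehicles
cost_faulty     = N * double(P11) * (200 + 50)   % (i)  detected and fault present
cost_not_faulty = N * double(P01) * 200          % (ii) detected but no fault present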
PROBLEM 2(35pts) Consider a model in which the process $x(k)$ is driven by a white noise process $u(k)$, where the roots of the model polynomial are assumed to lie in the unit circle of the complex plane. It follows that, for the proper initial conditions, $x(k)$ is a partial realization of the wss process $X(k)$.
(a)(6pts) Multiply the model by each of $x(k)$, $x(k-1)$, and $x(k-2)$, and take the expectation to arrive at 3 equations that involve the model parameters and the autocorrelations $r_x(0)$, $r_x(1)$, $r_x(2)$.
Solution:
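Since the model itself did not survive extraction, the following is a sketch of the requested step under the assumption that the process is the AR(2) model $x(k) = a_1 x(k-1) + a_2 x(k-2) + u(k)$, with $u(k)$ white noise of variance $\sigma_u^2$ and $r_x(m) = E[x(k)x(k-m)]$:

\begin{align*}
  \text{multiply by } x(k)\text{ and take } E[\cdot]:\quad   & r_x(0) = a_1 r_x(1) + a_2 r_x(2) + \sigma_u^2,\\
  \text{multiply by } x(k-1)\text{ and take } E[\cdot]:\quad & r_x(1) = a_1 r_x(0) + a_2 r_x(1),\\
  \text{multiply by } x(k-2)\text{ and take } E[\cdot]:\quad & r_x(2) = a_1 r_x(1) + a_2 r_x(0),
\end{align*}

where $E[x(k)u(k)] = \sigma_u^2$ and $E[x(k-1)u(k)] = E[x(k-2)u(k)] = 0$, because $u(k)$ is uncorrelated with past values of $x$.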
(b)(5pts) Suppose that you know the autocorrelations $r_x(0)$, $r_x(1)$, and $r_x(2)$. Rearrange the equations in (a) into a matrix equation whose unknowns are the model parameters.
Solution:
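Under the same assumed AR(2) model, one way to carry out the rearrangement in (b) is the following; the symbols $A$, $\theta$, $b$ are introduced here only for illustration:

\begin{align*}
  \underbrace{\begin{bmatrix} r_x(0) & r_x(1) \\ r_x(1) & r_x(0) \end{bmatrix}}_{A}
  \underbrace{\begin{bmatrix} a_1 \\ a_2 \end{bmatrix}}_{\theta}
  =
  \underbrace{\begin{bmatrix} r_x(1) \\ r_x(2) \end{bmatrix}}_{b},
  \qquad
  \sigma_u^2 = r_x(0) - a_1 r_x(1) - a_2 r_x(2).
\end{align*}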
(c)(7pts) Suppose that you know Rearrange (a) into the form , where . Then, for , use it to find the verify that .
(d)(7pts) Obtaining an explicit form for is not easy. Hence, to compute the psd for , we will use the Wiener-Kinchine Theorem: where . Rather than computing this explicitly,
we will assume , and use Matlab to compute it. To this end, note the the discrete Fourier transform of any sequence is where . Using the Matlab command ‘fft(b)’ where will result in computing at m uniformly spaced frequencies . Now consider the sequence . Then the command abs(fft(b)).^2 will compute at n uniformly spaced frequencies . Carry out this procedure to obtain a plot of vs. , where and n=2000.
[Note: the fft command allows for such automatic ‘zero-padding’.]
Solution: [See code @ 2(d).]
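A Matlab sketch of the fft procedure in (d), under the assumed AR(2) model; taking b to be a truncated impulse response of the model filter is an assumption about the sequence that did not survive extraction, and a1, a2, sigu2, and the truncation length m are hypothetical:

% Sketch for 2(d): psd via S_x(w) = sigu2*|H(w)|^2, with H approximated by the
% DFT of a length-m truncated impulse response, zero-padded to n points by fft.
a1 = 0.5;  a2 = -0.3;  sigu2 = 1;                 % hypothetical parameters
m  = 50;                                          % assumed truncation length
n  = 2000;                                        % number of frequency points
b  = filter(1, [1 -a1 -a2], [1 zeros(1, m-1)]);   % impulse response h(0),...,h(m-1)
Sx = sigu2 * abs(fft(b, n)).^2;                   % psd at n uniform frequencies
w  = 2*pi*(0:n-1)/n;                              % frequencies on [0, 2*pi)
plot(w, Sx), grid on
xlabel('\omega (rad/sample)'), ylabel('S_x(\omega)')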
(e)(10pts) Use the model to simulate values of $x(k)$. Note that the white noise $u(k)$ is easy to simulate. In Example 2(b) of the Random Signals II notes we show how $x(k)$ can be written in terms of $u(k)$; hence, $x(k)$ is also easy to simulate. (i) Plot a simulation. (ii) Comment on whether the range of amplitudes is to be expected. (iii) Comment as to whether the structure is consistent with the psd in (d).
Solution: [See code @ 2(e).]
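A Matlab sketch for (e) under the same assumed AR(2) model and hypothetical coefficients; Gaussian white noise is assumed here because the noise distribution in the original statement did not survive extraction, and N is an assumed sample count:

% Sketch for 2(e): simulate u(k), then pass it through the AR recursion.
a1 = 0.5;  a2 = -0.3;  sigu2 = 1;          % hypothetical parameters (as in 2(d))
N  = 2000;                                 % assumed number of samples
u  = sqrt(sigu2) * randn(1, N);            % white noise u(k) (Gaussian assumed)
x  = filter(1, [1 -a1 -a2], u);            % x(k) = a1*x(k-1) + a2*x(k-2) + u(k)
figure, plot(x), grid on
xlabel('k'), ylabel('x(k)'), title('Simulated realization (assumed AR(2) model)')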
