Answer:
Given Data:
myBytes BYTE 10h, 20h, 30h, 40h
myWords WORD 3 DUP(?), 2000h
myString BYTE "ABCDE"
Based on the data we are given, we can conclude that:
a. EAX = 1
b. EAX = 4
c. EAX = 4
d. EAX = 2
e. EAX = 4
f. EAX = 8
g. EAX = 5
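The original post does not show the instructions being evaluated, but the seven results above are exactly what the MASM operators TYPE, LENGTHOF, and SIZEOF return for these variables, so a plausible reconstruction (an assumption on my part, not the verbatim question) is:

mov eax, TYPE myBytes       ; a. EAX = 1  (one BYTE element = 1 byte)
mov eax, LENGTHOF myBytes   ; b. EAX = 4  (four elements)
mov eax, SIZEOF myBytes     ; c. EAX = 4  (4 elements * 1 byte)
mov eax, TYPE myWords       ; d. EAX = 2  (one WORD element = 2 bytes)
mov eax, LENGTHOF myWords   ; e. EAX = 4  (3 DUP(?) plus 2000h = 4 elements)
mov eax, SIZEOF myWords     ; f. EAX = 8  (4 elements * 2 bytes)
mov eax, SIZEOF myString    ; g. EAX = 5  (5 characters * 1 byte)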
Complete Question:
Recall that with the CSMA/CD protocol, the adapter waits K · 512 bit times after a collision, where K is drawn randomly. a. After the first collision, if K = 100, how long does the adapter wait until sensing the channel again for a 1 Mbps broadcast channel? For a 10 Mbps broadcast channel?
Answer:
a) 51.2 msec. b) 5.12 msec
Explanation:
If K = 100, the time Tw that the adapter must wait until sensing the channel again after detecting the first collision is given by the following expression:
Tw = K * 512 * (bit time)
The bit time is just the inverse of the channel bandwidth BW expressed in bits per second, i.e. bit time = 1/BW, so for the two instances posed by the question we have:
a) BW = 1 Mbps = 10⁶ bps
⇒ Tw = 100*512*(1/10⁶) sec = 51.2*10⁻³ sec = 51.2 msec
b) BW = 10 Mbps = 10⁷ bps
⇒ Tw = 100*512*(1/10⁷) sec = 5.12*10⁻³ sec = 5.12 msec
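As a single formula (writing $T_w$ for the wait time in seconds and $BW$ for the channel rate in bps, symbols introduced here for clarity):

$$T_w = \frac{K \times 512}{BW}$$

Since Tw scales as 1/BW, a tenfold faster channel gives exactly one tenth the wait, which is why the 10 Mbps answer is one tenth of the 1 Mbps answer.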
Answer:
D
Explanation:
The answer is D because that option does exactly what the problem statement asks for.
Answer:
C