A) Marian Shelby. “The Fall of Rome.” Military Forces. N.p., 11 Mar. 2009. Web. 14 Oct. 2012.
Answer: The items can be matched correctly as follows:
Static libraries: C
Dynamic link libraries: A
Using static libraries: B
Making some changes to DLL: D
Explanation:
Matching each term with its meaning:
Static libraries: are attached to the application at compile time by the linker (C).
Dynamic link libraries (DLLs): are loaded at runtime as applications need them (A).
Using static libraries: makes your program files larger compared to using DLLs (B).
Making some changes to a DLL: does not require the applications using it to recompile (D).
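As a minimal sketch of the runtime-loading behavior in (A) and (D), assuming a POSIX system and a hypothetical shared library libmath.so that exports a function add, a C program can load the library on demand with dlopen/dlsym:

    #include <stdio.h>
    #include <dlfcn.h>

    int main(void) {
        /* Load the shared library at runtime, not at compile time. */
        void *handle = dlopen("./libmath.so", RTLD_LAZY);
        if (!handle) {
            fprintf(stderr, "dlopen failed: %s\n", dlerror());
            return 1;
        }

        /* Look up a function exported by the library. */
        int (*add)(int, int) = (int (*)(int, int))dlsym(handle, "add");
        if (!add) {
            fprintf(stderr, "dlsym failed: %s\n", dlerror());
            dlclose(handle);
            return 1;
        }

        printf("2 + 3 = %d\n", add(2, 3));
        dlclose(handle); /* Unload when no longer needed. */
        return 0;
    }

Because the library is resolved at run time, replacing libmath.so with an updated build does not require recompiling this program, which is exactly the advantage described in (D).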
Answer:
The old cathode-ray tube (CRT) technology was replaced by the less bulky and more modern liquid crystal display (LCD) and LED technologies.
Explanation:
The old cathode-ray tube works by firing a beam of electrons through an evacuated glass tube; the beam, deflected by magnetic fields, strikes a phosphor-coated screen, producing the image and a small amount of X-rays. The tube required a lot of space, consumed more electricity, and was very bulky. The modern technologies are more compact, consume less power, and can be designed to be sleek and slim.
Answer: Probability/impact risk matrix
Explanation: A probability/impact risk matrix is a tool that rates each risk by its probability of occurring and by its impact, and shows whether the resulting risk is low, moderate, or high.
The matrix helps turn a risk plan into action: the impact of each risk is plotted on the vertical axis and its probability on the horizontal axis, so every risk falls into a cell of the grid.
Thus the probability/impact matrix is the correct tool for the problem described in the question.
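As an illustrative sketch of how such a matrix is read (the 1–3 scales and the thresholds below are assumptions, not part of any standard), each cell can be treated as the product of the two scores:

    #include <stdio.h>

    /* Classify a risk from its probability and impact scores (1 = low, 3 = high).
       The thresholds below are illustrative assumptions, not a standard. */
    const char *classify_risk(int probability, int impact) {
        int cell = probability * impact; /* the matrix cell the risk lands in */
        if (cell >= 6) return "high";
        if (cell >= 3) return "moderate";
        return "low";
    }

    int main(void) {
        printf("P=3, I=3 -> %s\n", classify_risk(3, 3)); /* high     */
        printf("P=1, I=3 -> %s\n", classify_risk(1, 3)); /* moderate */
        printf("P=1, I=1 -> %s\n", classify_risk(1, 1)); /* low      */
        return 0;
    }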
There are several weaknesses or disadvantages of file processing systems, but the major weakness is data redundancy and inconsistency. By data redundancy, I mean duplication of data. File systems provide no good method to validate against the insertion of duplicate data, and that redundancy also increases the chance of errors.
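As a minimal sketch of the problem (the record layout and the file name customers.txt are assumptions for illustration), a flat file silently accepts a duplicate record because nothing in the file system enforces uniqueness, whereas a database table could reject it with a primary-key constraint:

    #include <stdio.h>

    /* Append a customer record to a flat file. Unlike a database table with a
       primary-key or UNIQUE constraint, nothing here rejects duplicates. */
    int append_record(const char *path, int id, const char *name) {
        FILE *f = fopen(path, "a");
        if (!f) return -1;
        fprintf(f, "%d,%s\n", id, name);
        return fclose(f);
    }

    int main(void) {
        append_record("customers.txt", 42, "Ada Lovelace");
        append_record("customers.txt", 42, "Ada Lovelace"); /* duplicate stored silently */
        return 0;
    }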