Answer:
The right to refuse work that could affect their health and safety and that of others.
The activity of the sample after the 48-hour shipment is 4.54 mCi.
<h3>How to determine the number of half-lives that have elapsed</h3>
From the question given above, the following data were obtained:
- Time (t) = 48 hours
- Half-life (t½) = 14.28 days = 14.28 × 24 = 342.72 hours
- Number of half-lives (n) =?
n = t / t½
n = 48 / 342.72
n = 0.14
<h3>How to determine the activity remaining after shipping</h3>
- Number of half-lives (n) = 0.14
- Original activity (N₀) = 5.0 mCi
- Activity remaining (N) =?
N = N₀ / 2ⁿ
N = 5 / 2^0.14
N = 4.54 mCi
Thus, the activity of the sample after the 48-hour shipment is 4.54 mCi.
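The two steps above can be checked numerically. A minimal sketch in Python, using the values from the question (the variable names are illustrative):

```python
# Radioactive decay: N = N0 / 2**n, where n = t / t_half
t = 48.0             # elapsed time in hours
t_half = 14.28 * 24  # half-life of P-32 converted to hours (342.72 h)
n = t / t_half       # number of half-lives elapsed
N0 = 5.0             # original activity in mCi
N = N0 / 2 ** n      # activity remaining in mCi
print(round(n, 2), round(N, 2))  # 0.14 4.54
```

This reproduces both intermediate results: n ≈ 0.14 half-lives and N ≈ 4.54 mCi.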
Learn more about half-life:
brainly.com/question/2674699
The reason why Br has a greater magnitude of electron affinity than that of I is that there is a greater attraction between an added electron and the nucleus in Br than in I.
In the periodic table, there are trends that increase down the group and across the period. Electron affinity is a trend that increases across the period but decreases down the group.
Recall that the ability of an atom to accept an electron depends on the size of the atom. The smaller the atom, the greater the attraction between an added electron and the nucleus.
Since Br is smaller than I, there is a greater attraction between an added electron and the nucleus in Br than in I, which explains why Br has a greater magnitude of electron affinity than I.
Learn more: brainly.com/question/17696329
Correct option:
Entropy is used to calculate information gain.
What is entropy?
- Entropy is a measure of the uncertainty or randomness in data; the greater the randomness, the higher the entropy. Information gain uses entropy to make choices: information gain increases as the entropy after a split decreases.
- Decision trees and random forests use information gain to choose the best split: the higher the information gain, the better the split and the lower the resulting entropy.
- Information gain is calculated by comparing the entropy of a dataset before and after a split.
- Entropy is a way to quantify data uncertainty. The goal is to maximize information gain, which means minimizing the entropy after the split; the algorithm prioritizes the feature with the highest information gain when training the model.
- In short, whenever you compute information gain, you are computing entropy.
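The before/after comparison described above can be sketched directly. A small Python example (the function names and toy dataset are illustrative, not from the question):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    total = len(labels)
    return -sum((count / total) * math.log2(count / total)
                for count in Counter(labels).values())

def information_gain(parent, splits):
    """Entropy of the parent set minus the weighted entropy of the splits."""
    total = len(parent)
    weighted = sum(len(s) / total * entropy(s) for s in splits)
    return entropy(parent) - weighted

# A perfectly separating split removes all uncertainty:
parent = ['yes', 'yes', 'no', 'no']
gain = information_gain(parent, [['yes', 'yes'], ['no', 'no']])
print(gain)  # 1.0 (entropy drops from 1 bit to 0 bits)
```

Note how `information_gain` calls `entropy` on both the parent set and each split, which is exactly why the correct option says entropy is used to calculate information gain.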
Learn more about entropy here,
brainly.com/question/22145368