Answer:
C) 712 kJ/mol
Explanation:
- ΔH°r = Σ Eb(broken) − Σ Eb(formed)
- 1/2 Br₂(g) + 3/2 F₂(g) → BrF₃(g)
∴ ΔH°r = −384 kJ/mol
∴ Eb(Br–Br) = 193 kJ/mol
∴ Eb(F–F) = 154 kJ/mol
⇒ Σ Eb broken = (1/2)(Br–Br) + (3/2)(F–F)
⇒ Σ Eb broken = (1/2)(193 kJ/mol) + (3/2)(154 kJ/mol) = 96.5 + 231 = 327.5 kJ/mol
∴ Bonds formed: Br–F
⇒ Σ Eb formed (Br–F) = Σ Eb broken − ΔH°r
⇒ Eb(Br–F) = 327.5 kJ/mol − (−384 kJ/mol)
⇒ Eb(Br–F) = 327.5 kJ/mol + 384 kJ/mol = 711.5 kJ/mol ≈ 712 kJ/mol
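As a quick check, here is the same bookkeeping as a minimal Python sketch, using only the bond energies given above:

```python
# Bond-energy estimate: ΔH°r = Σ Eb(broken) − Σ Eb(formed)
dH_rxn = -384.0   # kJ/mol, given for 1/2 Br2(g) + 3/2 F2(g) → BrF3(g)
E_Br_Br = 193.0   # kJ/mol, Br–Br bond energy
E_F_F = 154.0     # kJ/mol, F–F bond energy

E_broken = 0.5 * E_Br_Br + 1.5 * E_F_F  # 327.5 kJ/mol
E_formed = E_broken - dH_rxn            # 327.5 − (−384) = 711.5 kJ/mol
print(f"{E_formed:.1f} kJ/mol")         # 711.5 ≈ 712 kJ/mol
```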
The molecular weight of LiOH is 23.95 g/mol, so the amount of LiOH in moles is:
1.64 g / (23.95 g/mol) = 0.0685 mol
The reaction of LiOH with HCl is:
HCl + LiOH → H₂O + LiCl
The LiOH:HCl coefficient ratio is 1:1, so you need the same number of moles of HCl to neutralize the LiOH:
moles HCl = moles LiOH
volume × 0.15 M = 0.0685 mol
volume = 0.0685 mol / (0.15 mol / 1000 mL)
volume ≈ 456.5 mL
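The same stoichiometry as a short Python sketch (values from the problem):

```python
# Neutralization: HCl + LiOH → H2O + LiCl (1:1 mole ratio)
mass_LiOH = 1.64   # g
M_LiOH = 23.95     # g/mol
c_HCl = 0.15       # mol/L

n_LiOH = mass_LiOH / M_LiOH   # ≈ 0.0685 mol
n_HCl = n_LiOH                # 1:1 stoichiometry
V_mL = n_HCl / c_HCl * 1000   # convert L to mL
print(f"{V_mL:.1f} mL")       # ≈ 456.5 mL
```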
Answer:
≈ 8.52 kJ of heat is released.
Explanation:
2N₂O(g) → 2N₂(g) + O₂(g)
Molecular weight of N₂O = 44 g/mol
ΔH° = −166.7 kJ/mol
44 g of N₂O decomposes to give 166.7 kJ of heat, so
2.25 g of N₂O decomposes to give 166.7 × 2.25 / 44 kJ of heat
= 8.52 kJ of heat.
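The proportional reasoning, as a quick Python check:

```python
# Heat from decomposing 2.25 g of N2O, treating 166.7 kJ as the heat per mole
heat_per_mol = 166.7   # kJ released per mol of N2O
M_N2O = 44.0           # g/mol
mass = 2.25            # g

q = heat_per_mol * mass / M_N2O
print(f"{q:.2f} kJ")   # ≈ 8.52 kJ
```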
I believe it was Hiroshima, followed by Nagasaki. Moscow was never bombed, to my knowledge, and Auschwitz was a death camp, so it wasn't bombed.
The data set is missing from the question; it is given in the attachment.
Solution:
a). In the table, there are four positive examples and five negative examples.
Therefore, $P(+) = \frac{4}{9}$ and $P(-) = \frac{5}{9}$.
The entropy of the training examples is given by:
$-\frac{4}{9}\log_2\left(\frac{4}{9}\right)-\frac{5}{9}\log_2\left(\frac{5}{9}\right)$
= 0.9911
b). For the attribute $a_1$, the class counts for each attribute value are:

| $a_1$ | + | − |
|-------|---|---|
| T     | 3 | 1 |
| F     | 1 | 4 |

The weighted entropy after splitting on $a_1$ is given by:
$\frac{4}{9}\left[-\frac{3}{4}\log_2\left(\frac{3}{4}\right)-\frac{1}{4}\log_2\left(\frac{1}{4}\right)\right]+\frac{5}{9}\left[-\frac{1}{5}\log_2\left(\frac{1}{5}\right)-\frac{4}{5}\log_2\left(\frac{4}{5}\right)\right]$
= 0.7616
Therefore, the information gain for $a_1$ is
0.9911 − 0.7616 = 0.2294
Similarly, for the attribute $a_2$, the class counts are:

| $a_2$ | + | − |
|-------|---|---|
| T     | 2 | 3 |
| F     | 2 | 2 |

The weighted entropy after splitting on $a_2$ is given by:
$\frac{5}{9}\left[-\frac{2}{5}\log_2\left(\frac{2}{5}\right)-\frac{3}{5}\log_2\left(\frac{3}{5}\right)\right]+\frac{4}{9}\left[-\frac{2}{4}\log_2\left(\frac{2}{4}\right)-\frac{2}{4}\log_2\left(\frac{2}{4}\right)\right]$
= 0.9839
Therefore, the information gain for $a_2$ is
0.9911 − 0.9839 = 0.0072
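A small Python sketch reproduces both information gains; the class counts come from the tables above (the attribute names $a_1$ and $a_2$ are taken from the attached data set):

```python
from math import log2

def entropy(counts):
    """Entropy (in bits) of a list of class counts."""
    n = sum(counts)
    return -sum(c / n * log2(c / n) for c in counts if c > 0)

def info_gain(parent, branches):
    """parent: class counts before the split; branches: counts per branch."""
    n = sum(parent)
    weighted = sum(sum(b) / n * entropy(b) for b in branches)
    return entropy(parent) - weighted

print(info_gain([4, 5], [[3, 1], [1, 4]]))  # a1: ≈ 0.2294
print(info_gain([4, 5], [[2, 3], [2, 2]]))  # a2: ≈ 0.0072
```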
For the continuous attribute $a_3$, the sorted values, candidate split points, weighted entropies, and information gains are:

| $a_3$ | Class | Split point | Entropy | Info gain |
|-------|-------|-------------|---------|-----------|
| 1.0   | +     | 2.0         | 0.8484  | 0.1427    |
| 3.0   | −     | 3.5         | 0.9885  | 0.0026    |
| 4.0   | +     | 4.5         | 0.9183  | 0.0728    |
| 5.0   | −     |             |         |           |
| 5.0   | −     | 5.5         | 0.9839  | 0.0072    |
| 6.0   | +     | 6.5         | 0.9728  | 0.0183    |
| 7.0   | +     |             |         |           |
| 7.0   | −     | 7.5         | 0.8889  | 0.1022    |
| 8.0   | −     |             |         |           |

The best split for $a_3$ is observed at the split point 2.0, with an information gain of 0.1427.
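The split-point scan can be reproduced with the same entropy helper; this is a minimal sketch, and the (value, class) pairs are assumed from the attached data set:

```python
from math import log2

def entropy(counts):
    n = sum(counts)
    return -sum(c / n * log2(c / n) for c in counts if c > 0)

# (a3 value, class) pairs, assumed from the attached data set
data = [(1.0, '+'), (3.0, '-'), (4.0, '+'), (5.0, '-'), (5.0, '-'),
        (6.0, '+'), (7.0, '+'), (7.0, '-'), (8.0, '-')]
parent = entropy([4, 5])  # 0.9911

for split in (2.0, 3.5, 4.5, 5.5, 6.5, 7.5):
    sides = ([c for v, c in data if v <= split],
             [c for v, c in data if v > split])
    weighted = sum(len(s) / len(data) * entropy([s.count('+'), s.count('-')])
                   for s in sides)
    print(split, round(parent - weighted, 4))  # best: split 2.0, gain 0.1427
```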
c). Comparing the information gains above (0.2294 for $a_1$, 0.0072 for $a_2$, and 0.1427 for the best split on $a_3$), we can say that $a_1$ produces the best split.