Answer:
1. I think it was actually a very good idea. As time goes on, technology improves, new laws and theories are proven, old conclusions are overturned, and the efficiency of certain tasks evolves. Another reason: why not? It does not hurt to form a hypothesis or try a new approach.
1. We could spend countless hours exploring and digging and possibly never find the evidence needed to pin down exactly what the first life forms on Earth were and when they lived, so why not try a different angle? I think it was very clever and interesting to use Moore's Law to hypothesize about the origin of life (a rough sketch of that idea follows below).
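Here is a minimal sketch in Python of what that different angle looks like, just to show the shape of the reasoning. Every number in it is a made-up placeholder (the complexity units, starting value, and doubling time are my assumptions, not figures from the reading): if complexity doubles on a fixed schedule, you can run the clock backwards to estimate when it started.

import math

def years_to_reach(c_now, c_start, doubling_time):
    # Solve c_now = c_start * 2**(t / doubling_time) for t:
    # how many years ago was complexity equal to c_start?
    return doubling_time * math.log2(c_now / c_start)

# Hypothetical illustration only: arbitrary complexity units and an
# assumed doubling time, chosen just to make the example concrete.
present_complexity = 1e9      # assumed complexity of life today
initial_complexity = 1.0      # assumed complexity of the first life form
doubling_years = 3.76e8       # assumed doubling time, in years

t = years_to_reach(present_complexity, initial_complexity, doubling_years)
print(f"Estimated origin: about {t / 1e9:.1f} billion years ago")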
2. Moore's Law applies to the efficiency of algorithms because Moore came up with a way to predict where technology should stand after a certain amount of time and how much improvement should have been built into our computers by then.
2. Moore's Law applies to the efficiency of algorithms because Moore described an exponential model, a hypothesis that has held up well, which guides engineers toward a steady pace of improvement in computer processors and components over a given span of time (see the sketch below).
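To make the exponential idea concrete, here is a minimal Python sketch of Moore's Law as a formula, assuming the usual rough statement that transistor counts double about every two years (the function name and the two-year period are my assumptions of the common form, not anything specific from the reading):

def transistors_after(n0, years, doubling_period=2.0):
    # Moore's Law as an exponential model: starting from n0 transistors,
    # the count doubles every doubling_period years.
    return n0 * 2 ** (years / doubling_period)

# Example: a chip with 2,300 transistors (roughly the Intel 4004 from 1971)
# projected forward 40 years under a two-year doubling period.
print(f"{transistors_after(2300, 40):,.0f}")  # about 2.4 billion

That single curve is why engineers could plan a steady pace of improvement: the model predicts where a processor should stand after a given number of years.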
3. In 10 years, I think computers will have made simple yet very effective and efficient improvements, like longer battery life, faster speeds, and better firewalls and virus defenders. But 100 years would lead to a drastic change in what the normal idea of a computer even is. It would seem incredibly high-tech to us, but in the world our grandkids will be living in, it will be normal to have holographic computers.
3. I think that in 10 years computers will have improved much as they have recently: smaller, more compact, faster, and longer lasting. But within 100 years we would have an entirely new invention; computers will have changed more than we could ever have thought.
Explanation: I put a few different answers for each question so you can mix and match and use whichever is easiest for you; all of them are correct.