Comment by imvetri 13 hours ago

A web crawler requires me to program the logic.

The concept does not require me to program the logic; instead, it starts by reading a seed page, stores the information/knowledge, breaks it down into doable actions, and performs the actions one by one.

This design allows the information to direct the concept, whereas in the case of a web crawler, I have to direct it.
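A minimal sketch of what this loop might look like (purely hypothetical: pages are stubbed as an in-memory dict, and "actions" are simple `verb:arg` tokens embedded in the page text; a real version would fetch URLs and parse real content):

```python
from collections import deque

# Hypothetical page store; a real system would fetch these over HTTP.
PAGES = {
    "seed": "learn:crawling visit:page2",
    "page2": "learn:parsing",
}

def extract_actions(text):
    """Break page text down into doable actions (verb:arg tokens)."""
    return [tuple(tok.split(":", 1)) for tok in text.split() if ":" in tok]

def run(seed):
    knowledge = []                     # stored information/knowledge
    queue = deque([("visit", seed)])
    while queue:
        verb, arg = queue.popleft()    # perform actions one by one
        if verb == "visit" and arg in PAGES:
            # the page content, not the programmer, decides what happens next
            queue.extend(extract_actions(PAGES[arg]))
        elif verb == "learn":
            knowledge.append(arg)
    return knowledge

print(run("seed"))  # → ['crawling', 'parsing']
```

The point of the sketch is the control flow: the only thing programmed up front is the action vocabulary; the sequence of actions comes from the data.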

giveita 5 hours ago

You want the system to have learned qualities (discretion, taste...), but it cannot learn them from a human programming logic, so it must learn either from

1. Something like Machine Learning

or..

2. Some emergent property of mathematics / computation.

If you find 2... hell that would be something.

Maybe genetic algorithms?
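For context, a genetic algorithm in its simplest form (this is a generic textbook sketch, not anything from the thread): a population of candidate solutions is repeatedly selected by fitness, recombined, and mutated, so behavior emerges from the objective rather than from hand-written logic. Here with a toy objective of maximizing the number of 1-bits in a genome:

```python
import random

random.seed(0)

def fitness(bits):
    return sum(bits)  # toy objective: count of 1-bits

def evolve(pop_size=20, genome_len=12, generations=40, mutation_rate=0.05):
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # selection: keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, genome_len)
            child = a[:cut] + b[cut:]             # single-point crossover
            child = [g ^ 1 if random.random() < mutation_rate else g
                     for g in child]              # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))
```

Nobody programs *which* bit pattern to produce; the fitness function plus variation does the directing, which is the appeal for the kind of self-directed behavior described above.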

fsflover 9 hours ago

A good web crawler can just go through all the links it finds; no programming is required.
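This point can be sketched in a few lines: the only logic is "extract links, follow each one once." Pages are stubbed as an in-memory dict here so the sketch is self-contained; a real crawler would fetch each URL with `urllib` or similar:

```python
from collections import deque
from html.parser import HTMLParser

# Stubbed site (url -> HTML); a real crawler would fetch these over HTTP.
SITE = {
    "/a": '<a href="/b">b</a> <a href="/c">c</a>',
    "/b": '<a href="/a">back</a>',
    "/c": "no links here",
}

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

def crawl(start):
    seen, queue = set(), deque([start])
    while queue:
        url = queue.popleft()
        if url in seen or url not in SITE:
            continue
        seen.add(url)
        parser = LinkExtractor()
        parser.feed(SITE[url])
        queue.extend(parser.links)  # just follow every link found
    return sorted(seen)

print(crawl("/a"))  # → ['/a', '/b', '/c']
```

The `seen` set is the only state; everything the crawler visits is dictated by the links in the pages themselves.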