I think a short recap is in order.
1) Batoto isn't trying to prevent downloading altogether, because that's impossible.
Whatever man makes, man breaks.
2) Instead, they're looking for a way to reduce the bandwidth hogging by crawlers that update frequently.
Limiting those crawlers to just the latest chapters would probably cut 80-90% of the traffic they generate, so that's a lot less load on Batoto's servers.
3) The biggest difference (I think) is that humans take time to read a page before clicking on; bots do not.
If it were possible to implement some kind of delay in loading pages (say, a 5-second timer before the next page is allowed to load),
that might slow the bots down significantly: they'd be forced to wait no matter what, which reduces their bandwidth load.
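The timer idea in 3) could be sketched server-side as a simple per-client cooldown. This is purely hypothetical (the function name, the 5-second value, and keying clients by session id or IP are all my assumptions, not anything Batoto actually does):

```python
import time

MIN_DELAY = 5.0  # seconds a client must wait between page loads (assumed value)

# last time each client (keyed by e.g. session id or IP) fetched a page
_last_request = {}

def allow_page_load(client_id, now=None):
    """Return True if the client may load the next page, False if it
    must keep waiting. A sketch of the cooldown idea, not a real API."""
    now = time.monotonic() if now is None else now
    last = _last_request.get(client_id)
    if last is not None and now - last < MIN_DELAY:
        return False  # too soon: a bot hammering pages gets throttled
    _last_request[client_id] = now
    return True
```

A human who spends ~5 seconds reading each page never hits the limit, while a bot either has to wait out the timer (cutting its request rate) or gets its early requests refused.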