Hello,
For testing purposes, I want to download part of my personal website plus all
the external images it links to, but only to the first level (since some sites,
like Wikipedia, serve a .png link as a page that itself contains more links).
I set these filters:
-*
+https://notes.ailothaen.fr/
+https://notes.ailothaen.fr/post/*
+https://notes.ailothaen.fr/page/*
+*.jpg
+*.png
+*.gif
-*#*
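For reference, here is roughly how I expect these rules to classify URLs, as a
minimal Python sketch. It assumes glob-style matching where the last matching
rule wins; the crawler's actual evaluation order may differ, and the example
URLs below are made up.

# Illustrative sketch only: glob allow/deny filtering where the last
# matching rule decides; the real crawler's semantics may differ.
from fnmatch import fnmatch

RULES = [
    ("-", "*"),
    ("+", "https://notes.ailothaen.fr/"),
    ("+", "https://notes.ailothaen.fr/post/*"),
    ("+", "https://notes.ailothaen.fr/page/*"),
    ("+", "*.jpg"),
    ("+", "*.png"),
    ("+", "*.gif"),
    ("-", "*#*"),
]

def allowed(url):
    # True if the last rule that matches the URL is a "+" rule.
    verdict = False
    for sign, pattern in RULES:
        if fnmatch(url, pattern):
            verdict = (sign == "+")
    return verdict

# What I expect:
print(allowed("https://notes.ailothaen.fr/post/some-article"))  # True (internal page)
print(allowed("https://upload.wikimedia.org/picture.png"))      # True (external image)
print(allowed("https://en.wikipedia.org/wiki/Example"))         # False (external page)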
My capture starts at <https://notes.ailothaen.fr/>, and I also set the
"External depth" option to 1.
However, only notes.ailothaen.fr/index.html and the links directly connected
to it are saved, which makes me think that notes.ailothaen.fr is being treated
as an "external website", even though it is the domain I want to save.
Is this normal? I saw another topic from a few years ago stating that the
External depth parameter was broken.
Thank you.