What do you use if not curl or wget?

I have a site that I need to get all the images from, while retaining a little of the directory structure. Because I'm on OS X I tried the built-in curl first, which works, except that every image it downloads as a .jpg can't be opened after download.
So I installed wget via Homebrew and tried the following in my terminal:
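(As an aside on the curl attempt: when a downloaded .jpg won't open, it is often not image data at all but an HTML error page saved under the image's name. The `file` command reports what the bytes actually are; a quick check, using one of the filenames from below as an example:)

```shell
# A real JPEG reports something like "JPEG image data"; an error page
# saved under a .jpg name reports "HTML document" or "ASCII text" instead.
file ~/Downloads/ss-images/about-africa-collection/hippo-bull-breaking-the-waters-surface-in-the-timbavati-huge.jpg
```

If it turns out to be HTML, the server was likely redirecting, and re-running curl with `-L` (follow redirects) may help.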

wget -P ~/Downloads/ss-images/about-africa-collection/ "https://www.africacollection.com/southern-africa-and-indian-ocean-package-holiday-images/hippo-bull-breaking-the-waters-surface-in-the-timbavati-huge.jpg";
wget -P ~/Downloads/ss-images/about-africa-collection/ "https://www.africacollection.com/southern-africa-and-indian-ocean-package-holiday-images/a-leap-of-leopards-high-in-the-african-trees-huge.jpg";
wget -P ~/Downloads/ss-images/about-africa-collection/ "https://www.africacollection.com/southern-africa-and-indian-ocean-package-holiday-images/a-botswana-sunrise-near-kwando-lagoon-huge.jpg";
wget -P ~/Downloads/ss-images/about-africa-collection/ "https://www.africacollection.com/southern-africa-and-indian-ocean-package-holiday-images/zebra-of-malilangwe-reserve-in-zimbabwe-at-the-waters-edge-huge.jpg";
wget -P ~/Downloads/ss-images/about-africa-collection/ "https://www.africacollection.com/southern-africa-and-indian-ocean-package-holiday-images/wildlife-abounds-in-etosha-national-park-in-namibia-huge.jpg";
wget -P ~/Downloads/ss-images/about-africa-collection/ "https://www.africacollection.com/southern-africa-and-indian-ocean-package-holiday-images/wildebeest-at-sunset-in-the-grassy-plains-of-africa-huge.jpg";
wget -P ~/Downloads/ss-images/about-africa-collection/ "https://www.africacollection.com/southern-africa-and-indian-ocean-package-holiday-images/tropical-paradise-in-the-seychelles-at-mahe-huge.jpg";
wget -P ~/Downloads/ss-images/about-africa-collection/ "https://www.africacollection.com/southern-africa-and-indian-ocean-package-holiday-images/stunning-displays-as-birds-flock-to-the-kenyan-rivers-huge.jpg";
wget -P ~/Downloads/ss-images/about-africa-collection/ "https://www.africacollection.com/southern-africa-and-indian-ocean-package-holiday-images/pride-of-lions-drinking-from-the-african-waterhole-huge.jpg";
wget -P ~/Downloads/ss-images/about-africa-collection/ "https://www.africacollection.com/southern-africa-and-indian-ocean-package-holiday-images/luxury-and-tranquility-at-alphonse-island-in-the-seychelles-huge.jpg";
wget -P ~/Downloads/ss-images/about-africa-collection/ "https://www.africacollection.com/southern-africa-and-indian-ocean-package-holiday-images/light-aircraft-flying-low-over-the-sossusvlei-huge.jpg";

For some unknown reason it starts off running fine, but bear in mind there are 7000 of these to run: after the first 10-15 it starts randomising the folder names. In other words, I end up with 5 images in one folder, then 1 in another and 3 in another, and I cannot work out any logic as to how or why.
This is roughly the directory structure it produces:

/ss-images/about-africa-collection/hippo-bull-breaking-the-waters-surface-in-the-timbavati-huge.jpg
/ss-images/about-africa-collection/a-leap-of-leopards-high-in-the-african-trees-huge.jpg
/ss-images/wbout-africa-collection/a-botswana-sunrise-near-kwando-lagoon-huge.jpg
/ss-images/awout-africa-collection/zebra-of-malilangwe-reserve-in-zimbabwe-at-the-waters-edge-huge.jpg
/ss-images/abwut-africa-collection/wildlife-abounds-in-etosha-national-park-in-namibia-huge.jpg
/ss-images/abowt-africa-collection/wildebeest-at-sunset-in-the-grassy-plains-of-africa-huge.jpg
/ss-images/abouw-africa-collection/tropical-paradise-in-the-seychelles-at-mahe-huge.jpg
/ss-images/about-wfrica-collection/stunning-displays-as-birds-flock-to-the-kenyan-rivers-huge.jpg
/ss-images/about-awrica-collection/pride-of-lions-drinking-from-the-african-waterhole-huge.jpg
/ss-images/about-afwica-collection/luxury-and-tranquility-at-alphonse-island-in-the-seychelles-huge.jpg
/ss-images/about-afrwca-collection/light-aircraft-flying-low-over-the-sossusvlei-huge.jpg

Look at the word "about": it just starts replacing letters with "w", moves one letter forward, and does it again, over and over.
As a result, not only does it produce the wrong paths, it also throws hundreds of 404 errors, because the generated filenames do not exist on the server.

So my question is: why is wget doing this? Or is there another tool I can use to do the same job without it renaming things whenever it feels like it?

I'm curious about the trailing semicolon. What is that for?


I was trying to terminate after every command. Maybe I should try && instead; I think that waits for the previous command to succeed before running the next one. Give me a moment to try, I wonder.
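For what it's worth, the difference between the two separators is easy to test directly in the shell: `;` runs the next command unconditionally, while `&&` runs it only if the previous command exited with status 0.

```shell
# ';' runs the next command no matter what the previous one returned
false ; echo "ran after ;"      # prints: ran after ;

# '&&' runs the next command only if the previous one succeeded
false && echo "ran after &&"    # prints nothing
true  && echo "ran after &&"    # prints: ran after &&
```

So with `;` every wget fires regardless of what happened before it, whereas with `&&` the chain stops at the first failure.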

A vanilla terminal terminates a command with just a line feed, yes?

Also, I think wget supports passing it a text file of URLs to retrieve.
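It does: wget's `-i` (`--input-file`) option reads URLs from a file, one per line, which would avoid pasting 7000 lines into the terminal at all. A sketch, with the URL list saved in a hypothetical urls.txt (only the first two URLs from above shown):

```shell
# Put all 7000 URLs in a plain text file, one per line:
printf '%s\n' \
  "https://www.africacollection.com/southern-africa-and-indian-ocean-package-holiday-images/hippo-bull-breaking-the-waters-surface-in-the-timbavati-huge.jpg" \
  "https://www.africacollection.com/southern-africa-and-indian-ocean-package-holiday-images/a-leap-of-leopards-high-in-the-african-trees-huge.jpg" \
  > urls.txt

# -i reads the URL list from the file; -P sets the download directory as before
wget -P ~/Downloads/ss-images/about-africa-collection/ -i urls.txt
```

One wget invocation then handles the whole list sequentially, so there are no shell separators to worry about.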

Just replaced ; with && and it is running like a dream now: exactly what I wanted. Thank you for questioning the terminator.
I used it because I was copy-pasting 7000 lines of this and figured it would be the safest bet.

I still have to wonder what made this thing start throwing random w's all over the place. Very odd.
