Monday, 25 December 2017

How to Download Everything From A Website Automatically

WGet

Have you ever come across a website full of useful files that you want to save to your hard drive, but can't because there are no download buttons?

I'm a student, and when I'm browsing the web for study material I often encounter such sites. Past papers, worksheets and books are usually stored as separate files on these websites, so downloading them all manually, one by one, is really tiring.

Let me introduce you to a very small program that solves this problem. It's only about 160 KB in size, but trust me, it's a Swiss-army knife when it comes to downloading from the internet.

Originally built for Unix-like systems about 21 years ago (in 1996), WGet is a powerful program that retrieves content from web servers. Its name combines "World Wide Web" and "get". It lets you download files recursively, and it can even retrieve files that aren't shown in a site's visible navigation, as long as they're linked somewhere in the pages it crawls.
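
If you'd like to try WGet directly before using the script below, here's an illustrative command. The URL is just a placeholder, and the file extensions are examples; the flags themselves are standard WGet options:

    wget --recursive --level=5 --no-parent --continue --accept pdf,doc,docx https://example.com/past-papers/

Here --recursive follows the links on each page, --level=5 limits how deep the crawl goes, --no-parent stops it from climbing above the starting folder, --continue resumes interrupted downloads, and --accept keeps only the listed file types.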

It's not very difficult to use, but just to make things easier for everyone, I've put together a little program in MS-Batch (a Windows batch script) that does all the work for you.
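
To give an idea of what such a wrapper does, here is a minimal sketch. This is not the actual script from the download link below, and it assumes wget.exe sits in the same folder as the script or somewhere on your PATH:

    @echo off
    rem Minimal WGet wrapper sketch - not the downloadable script itself.
    rem Assumes wget.exe is in this folder or on the PATH.
    set /p url=Enter the URL to download from: 
    wget --recursive --no-parent --continue "%url%"
    pause

The actual script is available via the download link below; the sketch just shows the core idea of prompting for a URL and handing it to WGet with recursive options.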

If you encounter any bugs with this script feel free to comment below and I'll try to help out to the best of my ability.
Keep in mind that this program won't let you download files from websites sitting behind protection services like Cloudflare that block automated requests, including this website.


Download Now!


