How to use wget on a list of images











There's this beautiful boy who periodically uploads pictures of himself to his website. I am trying to automate the process of downloading these images to my computer.



So far, I'm able to download his webpage and parse it for jpg files. I end up with a file like this:



http://stat.ameba.jp/user_images/20120129/19/maofish/f9/60/j/o0480064011762693689.jpg
http://imgstat.ameba.jp/view/d/70/stat001.ameba.jp/user_images/20120127/22/maofish/f7/3e/j/t02200293_0480064011759076335.jpg
http://imgstat.ameba.jp/view/d/70/stat001.ameba.jp/user_images/20120125/18/maofish/80/46/j/t02200293_0480064011755033425.jpg
http://imgstat.ameba.jp/view/d/70/stat001.ameba.jp/user_images/20120120/20/maofish/3c/99/j/t02200290_0480063311745603530.jpg
http://stat.ameba.jp/user_images/20100219/16/maofish/33/0b/j/t01400198_0140019810420649113.jpg
http://stat.ameba.jp/user_images/b0/09/10101851128_s.jpg
http://stat.ameba.jp/user_images/9c/26/10027225053_s.jpg


I can download any one of those images by running wget on its URL, but I would like to automate the process and fetch everything in the list. I tried piping and redirecting the list to wget, but it doesn't work. How can I accomplish what I'm trying to do?
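For reference, a URL list like the one above can be produced with a plain grep over the saved page. This is only a sketch of the parsing step the question describes; the page URL is a placeholder, not the actual site:

```shell
# Fetch the page and extract every .jpg URL, one per line, deduplicated.
# "http://example.com/gallery.html" is a placeholder URL.
wget -qO- "http://example.com/gallery.html" \
  | grep -oE 'http://[^"]+\.jpg' \
  | sort -u > list
```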










  • What have you already tried?
    – r4.
    Jan 30 '12 at 9:10






  • I have tried wget | list and wget < list.
    – tony_sid
    Jan 30 '12 at 9:12















linux wget






asked Jan 30 '12 at 9:08









tony_sid

2 Answers
You can use wget's -i option, which reads the URLs to download from a file:



$ wget -i input_file.txt


All of the files will be downloaded into the current directory. See man wget for more options.






  • Is there a way to tell wget to ignore images that are smaller than a certain size?
    – tony_sid
    Jan 30 '12 at 9:22






  • If the target is to remove the small files (less than some threshold), you can use find to delete them automatically after download.
    – Khaled
    Jan 30 '12 at 9:26
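Khaled's find suggestion from the comment above might look like this. It's a sketch: the 50 KB threshold and the assumption that the images were downloaded into the current directory are both illustrative.

```shell
# After wget finishes, delete any downloaded .jpg smaller than 50 KB.
# -size -50k matches files under 50 kilobytes; -delete removes them.
find . -maxdepth 1 -name '*.jpg' -size -50k -delete
```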


















minsize=50  # skip anything smaller than 50 KB

while read -r url
do
    # GET -Ssed (from libwww-perl) fetches only the response headers
    bytes=$(GET -Ssed "$url" | grep Content-Length | awk '{print $2}')
    if [ "$((bytes / 1024))" -ge "$minsize" ]; then
        wget -q "$url"
    fi
done < list
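If libwww-perl's GET command isn't installed, the same header check can be done with curl; -sI issues a HEAD request and prints only the response headers. This is a sketch under the same assumptions as the script above (file named list, 50 KB threshold):

```shell
minsize=50  # KB; anything smaller is skipped
while read -r url
do
    # Pull the Content-Length header; adding 0 strips the trailing CR.
    bytes=$(curl -sI "$url" | awk 'tolower($1) == "content-length:" {print $2 + 0}')
    if [ "${bytes:-0}" -ge $((minsize * 1024)) ]; then
        wget -q "$url"
    fi
done < list
```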





    answered Jan 30 '12 at 9:17









    Khaled

        edited Dec 1 at 16:08









        Scott

        answered Feb 1 '12 at 23:24









        tao
