Nasa Image download


Problem description

I want to download an image from NASA with the following options.

  1. Given a specific date, the script should be able to download the image posted on that date

  2. Given a specific date, the script should be able to download the title, explanation text and credits

Following is the code I tried, but it's not fully functional.

GET_DESCRIPTION="yes"

PICTURES_DIR=~/Pictures

DESCRIPTION_DIR=~

function get_page {
    echo "Downloading page to find image"
    wget http://apod.nasa.gov/apod/ --quiet -O /tmp/apod.html
    grep -m 1 jpg /tmp/apod.html | sed -e 's/<//' -e 's/>//' -e 's/.*=//' -e 's/"//g' -e 's/^/http:\/\/apod.nasa.gov\/apod\//' > /tmp/pic_url
}

function save_description {
    if [ "${GET_DESCRIPTION}" == "yes" ]; then
        echo "Getting description from page"
        # Get description
        if [ -e $DESCRIPTION_DIR/description.txt ]; then
            rm $DESCRIPTION_DIR/description.txt
        fi

        if [ ! -e /tmp/apod.html ]; then
            get_page
        fi

        echo "Parsing description"
        sed -n '/<b> Explanation: <\/b>/,/<p> <center>/p' /tmp/apod.html |
        sed -e :a -e 's/<[^>]*>//g;/</N;//ba' |
        grep -Ev 'Explanation:' |
        tr '\n' ' ' |
        sed 's/  /\n\n/g' |
        awk 'NF { print $0 "\n" }' |
        sed 's/^[ \t]*//' |
        sed 's/[ \t]*$//' > $DESCRIPTION_DIR/description.txt
    fi
}

TODAY=$(date +'%Y%m%d')

if [ ! -e ~/Pictures/${TODAY}_apod.jpg ]; then
    echo "We don't have the picture saved, save it"

    get_page

    PICURL=`/bin/cat /tmp/pic_url`

    echo  "Picture URL is: ${PICURL}"

    echo  "Downloading image"
    wget --quiet $PICURL -O $PICTURES_DIR/${TODAY}_apod.jpg

    echo "Setting image as wallpaper"
    gconftool-2 -t string -s /desktop/gnome/background/picture_filename $PICTURES_DIR/${TODAY}_apod.jpg

    save_description

else
    get_page

    PICURL=`/bin/cat /tmp/pic_url`

    echo  "Picture URL is: ${PICURL}"

    SITEFILESIZE=$(wget --spider $PICURL 2>&1 | grep Length | awk '{print $2}')
    FILEFILESIZE=$(stat -c %s $PICTURES_DIR/${TODAY}_apod.jpg)

    if [ $SITEFILESIZE != $FILEFILESIZE ]; then
        echo "The picture has been updated, getting updated copy"
        rm $PICTURES_DIR/${TODAY}_apod.jpg


        PICURL=`/bin/cat /tmp/pic_url`

        echo  "Downloading image"
        wget --quiet $PICURL -O $PICTURES_DIR/${TODAY}_apod.jpg

        echo "Setting image as wallpaper"
        gconftool-2 -t string -s /desktop/gnome/background/picture_filename $PICTURES_DIR/${TODAY}_apod.jpg

        save_description
    else
        echo "Picture is the same, finishing up"
    fi
fi

I am very new to bash and I found the above code on GitHub. This is not my work. I can understand what's happening in the code, but it's not doing what I want. Please help.

Recommended answer

To modify your existing code to download a specific date, change:

TODAY=$(date +'%Y%m%d')

to:

TODAY=$1

and pass your date to your script by running your script like this:

./nasa.sh 20191031
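Note that get_page always fetches the APOD front page, which only shows today's picture. For past dates, APOD archive pages follow the naming pattern apYYMMDD.html with a two-digit year. A minimal sketch of deriving that URL from the date argument (the naming convention is an assumption taken from APOD's archive listing; verify it against the site):

```shell
#!/bin/sh
# Sketch: turn a YYYYMMDD argument into the APOD archive page URL.
# Archive pages are named apYYMMDD.html (two-digit year),
# e.g. 20191031 -> ap191031.html.
DATE="${1:-20191031}"
PAGE_URL="https://apod.nasa.gov/apod/ap${DATE#??}.html"
echo "$PAGE_URL"
# get_page would then fetch this page instead of the front page:
#   wget "$PAGE_URL" --quiet -O /tmp/apod.html
```

`${DATE#??}` strips the first two characters (the century) from the argument, which is all the renaming the archive scheme needs.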

The image will be saved in ~/Pictures, and the description will be saved as ~/description.txt. (~ means your home directory.) You can change the image and description destination directories by changing these variable assignments at the top of the script:

PICTURES_DIR=~/Pictures
DESCRIPTION_DIR=~
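If either destination directory might not exist on a fresh machine, it is worth creating both before downloading, so wget and the description write do not fail. A small sketch:

```shell
PICTURES_DIR=~/Pictures
DESCRIPTION_DIR=~
# Create the destination directories up front; mkdir -p is a no-op
# when they already exist, so this is safe to run every time.
mkdir -p "$PICTURES_DIR" "$DESCRIPTION_DIR"
```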

PS: delete these lines about setting the image as your desktop wallpaper:

echo "Setting image as wallpaper"
gconftool-2 -t string -s /desktop/gnome/background/picture_filename $PICTURES_DIR/${TODAY}_apod.jpg
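The question also asks for the title and credits, which save_description does not capture. As a hypothetical starting point (the `<b> Title </b>` markup is an assumption about the APOD page layout; verify it against a downloaded /tmp/apod.html), the title can be pulled out with sed:

```shell
# Sketch: extract the picture title from APOD-style HTML. The inline
# sample mimics the assumed page layout; in the real script the input
# would be /tmp/apod.html instead.
sample='<center>
<b> Daphnis and the Rings of Saturn </b> <br>
Image Credit: NASA, JPL-Caltech
</center>'
# Capture the text between the first <b> and </b>, trimming the
# padding spaces APOD puts around the title.
title=$(printf '%s\n' "$sample" | sed -n 's/.*<b> *\(.*[^ ]\) *<\/b>.*/\1/p' | head -n 1)
echo "$title"
```

The same pattern, pointed at the line containing "Image Credit", would recover the credits; both can be appended to description.txt next to the explanation text.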
