Author Topic: Linux programs and Bash scripts to consolidate photos and begin a slideshow

Dragon

  • Administrator
  • Platinum Level
  • *****
  • Posts: 4862
  • Have you played my board game?
    • Rival Troops
Earlier this week, a backup drive of mine crashed... it doesn't seem to be recoverable without sending it to a specialist lab with a cleanroom. (I didn't know what a "cleanroom" was before this week, so obviously this isn't something that I'd have access to at work.) Anyway, I'm pretty sure that I've lost very little overall, if anything. It was one of 3 backup drives, but it failed the day after I had run an rsync process to copy over some files that were still missing from one of the 3.

Of course, I noticed that on my personal computer, I didn't have all the family photos that were available on the storage drive, even though I have plenty of space for them. To remedy that, I thought that now would be a good time to consolidate some photos, since they had been collected up in all different locations over the years. My personal computer is running Ubuntu Desktop 16.04 currently, while several other computers in the house over the years have primarily been Windows computers. Fortunately, two primary folder names have been used, regardless of which OS they were on - "Photos" and "Pictures".

To begin, I decided to create a text file containing all the paths with either of those names. Since I have my storage drive connected to my laptop under /mnt/toshiba, I opened up the command line and started the search there.

Code: [Select]
find /mnt/toshiba -name Photos >> DirectoryList-Photos.txt
find /mnt/toshiba -name Pictures >> DirectoryList-Pictures.txt
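Incidentally, the two searches can be combined into one pass with find's -o operator, and adding -type d keeps any stray files that happen to be named "Photos" out of the list. A quick sketch against hypothetical /tmp paths rather than the real drive:

Code: [Select]
```shell
#!/bin/bash
# Sketch with hypothetical /tmp paths: one find for both folder names.
mkdir -p /tmp/findemo/2005/Photos /tmp/findemo/old/Pictures
find /tmp/findemo \( -name Photos -o -name Pictures \) -type d
```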

After creating those two files and verifying the directories they listed, I went on to creating the Bash script, which would read through the files and then use the path on each line to copy everything to the Pictures directory on my laptop. One thing that I noticed before starting this part, though, is that files with duplicate names are going to get overwritten. Since I want to do some purging of duplicates anyway, I'm not too concerned about that, but I am still going to tackle each list separately, at least to keep some separation between backup runs. Here is my Bash script:

Code: [Select]
#!/bin/bash
# Copy files from the Photos and Pictures directories of the Toshiba backup to my Sony Vaio HD.

SOURCES=/home/dragon/DirectoryList-Photos.txt
DESTDIR=/home/dragon/Pictures/

while IFS= read -r LINE; do
[ -z "$LINE" ] && continue # skip blank lines
rsync -avh --exclude='.*' "$LINE/" "$DESTDIR" --log-file=/home/dragon/rsync-toshiba-to-vaio.log
done < "$SOURCES"
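Since later copies clobber earlier ones when names collide, a variant of the same read-a-list loop can skip collisions instead. This is only a sketch against hypothetical /tmp paths, not the script I actually ran:

Code: [Select]
```shell
#!/bin/bash
# Sketch (hypothetical /tmp paths): copy each directory named in a list,
# skipping any file whose name already exists in the destination.
mkdir -p /tmp/photodemo/src1 /tmp/photodemo/src2 /tmp/photodemo/dest
echo "first"  > /tmp/photodemo/src1/a.jpg
echo "second" > /tmp/photodemo/src2/a.jpg     # same name: would collide
printf '%s\n' /tmp/photodemo/src1 "" /tmp/photodemo/src2 > /tmp/photodemo/list.txt

while IFS= read -r LINE; do
    [ -z "$LINE" ] && continue                # skip blank lines
    for f in "$LINE"/*; do
        base=$(basename "$f")
        if [ -e "/tmp/photodemo/dest/$base" ]; then
            echo "keeping existing copy of $base"
        else
            cp "$f" "/tmp/photodemo/dest/$base"
        fi
    done
done < /tmp/photodemo/list.txt
cat /tmp/photodemo/dest/a.jpg                 # "first" - the original survived
```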

Once that was done, I was able to jump into my Pictures directory and start up my slideshow with these commands:

Code: [Select]
cd ~/Pictures/
eog --slide-show ./
"Hello IT. Have you tried turning it off and on again? ... OK, well, the button on the side. Is it glowing?... Yeah, you need to turn it on. Err, the button turns it on. Yeah, you do know how a button works, don't you? No, not on clothes." - Roy (The IT Crowd)

Dragon

I found another Linux program for browsing images which looks pretty nice too... it's called gThumb Image Viewer.

Code: [Select]
sudo apt-get install gthumb
It has a Presentation mode for running a slideshow, an Organization option for grouping photos by date or other criteria (without moving them to new locations), and even a Find Duplicates feature. Since I'm still moving files over to my laptop from my storage drive, I haven't tried all these features out yet, but I do intend to check them out soon.

One thing that I have already noticed about gThumb is that it loads up pretty quickly, much faster than the Nautilus File Manager when viewing a directory with hundreds of photos. On the flip side, it appears to have a delay detecting new files. Since I'm in the middle of copying 11 GB of images from one drive to another, I've noticed that several directories simply showed "(Empty)" even though I could navigate to the same directory through Nautilus and find plenty of files.

Dragon

Since my local hard drive had filled up quickly, I had to change where my staging directory would be. I have a 2TB drive mounted at /media/dragon/BluePortableDrive/ with a folder in there called PhotoStage. I copied my files there in smaller batches, using gThumb after each batch to remove duplicate files. Now that I've had some time to get all the files copied into a single directory, it's time to put them back on my storage drive in their consolidated location.

Code: [Select]
rsync -avh "/media/dragon/BluePortableDrive/PhotoStage/" "/mnt/toshiba/Photo/"
To handle the same process simply on our other Windows computers in the house, I've looked around for some "rsync for Windows" programs, but the ones that I found were just too cumbersome for what I wanted. DeltaCopy and cwRsync were a couple of options that I noticed, but ultimately xcopy seemed to be the answer I was looking for. After using the Windows Map Network Drive feature to connect "V:" to my PhotoStage folder, I ran this simple command from the Windows command line to start copying all of the files from PhotoStage to the current directory of that PC:

Code: [Select]
xcopy V:\ *.* /e
Check out http://www.computerhope.com/xcopyhlp.htm to see more about the options and examples for using Windows xcopy.

Another computer on our network connects to the root of our storage drive as "T:". On that one, having a slash at the end of the directory path to Photo caused a problem finding the path, so the xcopy command was just slightly different:

Code: [Select]
xcopy T:\Photo *.* /e
I've also found that robocopy is a more modern replacement for xcopy that is also built into Windows. The syntax is slightly different, but I'm trying it on another computer to copy the same files:

Code: [Select]
robocopy T:\Photo C:\Users\Public\Pictures /e
With just this simple command, robocopy shows much more, such as the percentage of each file transfer, whether a file is new, and the file size, so this is probably going to be the best method for setting up my automated tasks. I also noticed that robocopy completed the task more quickly, possibly because it was a different path, but even after twice as much time, the xcopy process hadn't completed and appeared to be hung up transferring one specific video file. Looking into the new directory, I saw that the subdirectories, which held the majority of the photos, hadn't been copied at all, so I just cancelled that task and ran robocopy on that computer as well.
« Last Edit: October 23, 2016, 20:41:50 by Dragon »

Dragon

It had been a while since I'd done any photo purging, since we've been using Google Photos to store new pictures, but I wanted to get back to my old storage drives. I was going through some old photos recently and gThumb was being slow. My wife came in, made some helpful suggestions, and we talked about what I was looking to do. My primary concern was that it needed to group photos using the folder structure of the native operating system rather than relying solely on database records that wouldn't be recognized on another computer without the same program.

While I continued to do some cleanup, my wife discovered a program called digiKam, an open-source photo management tool that has a duplicate finder along with other tools. I got it installed, and although it still took a while to search through all the photos that I have, the Duplicates feature is very nice. Preview images of the photos, along with their paths, are visible during the process. I have seen other programs where you could select something like "Delete all except one", which I have NOT yet seen in digiKam, but aside from that, I've found it to be quite user friendly.

Changing the Album that a photo is in, or renaming the photo, does so on the native file structure. There is also a SQLite database that stores information about the images locally. Options to update metadata on the actual files do exist, although that seems to be discouraged since it slows things down. Considering that I want to keep things useful outside of this application as much as possible, a slightly slower response in exchange for having the metadata stored on the actual photo files seems like a good trade, although I haven't yet tried updating any metadata there.
« Last Edit: April 28, 2019, 09:30:19 by Dragon »

Dragon

In the process of moving photos from one location to another, especially uploading to Google Photos, I've come across some corrupt video files. Lots of times we have videos of things that could have been photos, but not to digress from the purpose of this post: I found a nice little way to extract photos from some AVI files even when Google isn't able to process the file as video. In these cases, I had some videos which would play a little after opening the file locally, but would error out before getting to the end.

This is for use in Linux. This command will take MVI_1271.AVI, extract an image every second, and store the frames as MVI_1271-01.jpeg, then MVI_1271-02.jpeg, MVI_1271-03.jpeg, and so on. Just replace MVI_1271.AVI with the video file you have, and adjust the destination name at the end of the line. Also, %02d at the end will form zero-padded 2-digit suffixes, while %03d will form 3-digit suffixes on your extracted image files.

Code: [Select]
ffmpeg -i MVI_1271.AVI -r 1 -f image2 MVI_1271-%02d.jpeg
I didn't have ffmpeg installed to start with, but it was simple to install it:

Code: [Select]
sudo apt install ffmpeg

Dragon

Another thing that might be helpful if you're trying to upload to Google Photos with drag-and-drop onto an album page: you can drag a folder onto it to upload the whole folder, but it doesn't seem to pick up anything more than one level deep. You also can't just upload a zip file there; you could onto Google Drive, but not Google Photos. If you have a bunch of zip files that were created for storing photos from a single directory, you might want to extract them all into the same folder. If they are in multiple levels of directories, and you're confident that you don't have duplicate filenames for what are actually different photos, you could run a command like this to grab all the readable files from 2 levels down or deeper and put them up in the current directory:

Code: [Select]
find ./ -mindepth 2 -readable -type f -exec mv {} ./ \;
Want to try this out with a prompt? Use the -ok option to be asked before each file is moved:

Code: [Select]
find ./ -mindepth 2 -readable -type f -ok mv {} ./ \;
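One caveat with the plain mv: if two subdirectories hold different files with the same name, the later one silently overwrites the earlier. GNU mv's --backup=numbered keeps both instead. A sketch on hypothetical /tmp paths:

Code: [Select]
```shell
#!/bin/bash
# Sketch (hypothetical /tmp paths): flatten two levels of directories,
# keeping numbered backups when two files share a name.
mkdir -p /tmp/flatdemo/a/b /tmp/flatdemo/c/d
echo one > /tmp/flatdemo/a/b/pic.jpg
echo two > /tmp/flatdemo/c/d/pic.jpg
cd /tmp/flatdemo
find ./ -mindepth 2 -type f -exec mv --backup=numbered -t ./ {} +
ls                                  # pic.jpg plus pic.jpg.~1~
```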

Dragon

Have a list of filenames to move (even if there are blank lines in between) and want something to move them quickly? This simple one-liner worked great for me and is easy to understand, as long as you know that those are back-ticks, not single-quote marks.

Code: [Select]
mv `cat /tmp/list.txt` /app/dest/

Source: https://unix.stackexchange.com/questions/115734/move-file-by-list-in-file-with-leading-whitespace
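One caveat: the back-ticks split on every run of whitespace, not just on newlines, so filenames containing spaces will break that one-liner. A whitespace-safe variant using a read loop (hypothetical /tmp paths):

Code: [Select]
```shell
#!/bin/bash
# Sketch (hypothetical /tmp paths): move every file named in a list,
# one name per line, tolerating blank lines and spaces in filenames.
mkdir -p /tmp/mvdemo/dest
echo hi > "/tmp/mvdemo/My Photo.jpg"
printf '%s\n' "/tmp/mvdemo/My Photo.jpg" "" > /tmp/mvdemo/list.txt

while IFS= read -r name; do
    [ -z "$name" ] && continue      # skip blank lines
    mv "$name" /tmp/mvdemo/dest/
done < /tmp/mvdemo/list.txt
ls /tmp/mvdemo/dest                 # My Photo.jpg
```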

Dragon

Need to reorganize some of those automatically created folders that Google exports with takeout.google.com?

Code: [Select]

#!/bin/bash
# Run rsync then remove files through multiple folders like so:
# rsync -av Photos\ from\ 2007/* 2007/ && rm -r Photos\ from\ 2007

for YEAR in {1985..2021}; do
        if [ -d "Photos from $YEAR" ]; then
                mkdir -p "$YEAR"
                for NUM in {0..9}; do
                        if [[ $(ls -A "Photos from $YEAR/$NUM"* 2>/dev/null) ]]; then
                                rsync -av "Photos from $YEAR/$NUM"* "$YEAR/" && rm -r "Photos from $YEAR/$NUM"*
                        else
                                echo "Skip $NUM"
                        fi
                done
                for LTR in {A..Z}; do
                        if [[ $(ls -A "Photos from $YEAR/$LTR"* 2>/dev/null) ]]; then
                                rsync -av "Photos from $YEAR/$LTR"* "$YEAR/" && rm -r "Photos from $YEAR/$LTR"*
                        else
                                echo "Skip $LTR"
                        fi
                done
                rsync -av "Photos from $YEAR"/* "$YEAR/" && rm -r "Photos from $YEAR"
        else
                echo "A directory called 'Photos from $YEAR' doesn't exist."
        fi
done


Dragon

I found out about a program called fdupes for finding duplicate files on Linux. It just runs on the command line, but it seems pretty quick and has some convenient options for deleting the duplicates with or without prompts. Here are the first two commands, for installing it and for scanning the Pictures directory.

Code: [Select]
sudo apt-get install fdupes
fdupes -r ~/Pictures

The default behavior is just to list the duplicates in groups. The '-r' option is for recursive checks in subdirectories. Including the '-d' option will prompt the user to keep one or more files from each set while deleting the other duplicates. There is also the '--noprompt' option, which deletes files without asking the user.

Dragon

I've been using fdupes on Linux for a couple of days now. Very nice, simple process for the most part. Another cool thing is that fdupes is available for Mac via Homebrew. (https://formulae.brew.sh/formula/fdupes)

I'd prefer it had built-in functionality to designate a preferred directory whose files are preserved when deleting duplicates, but that's my only complaint. Overall, it's still better than other methods I've used.

Dragon

Still sorting some things out, due to lots of duplicate files and lots of other places where files ended up. I've used rsync to get a bunch of files back into my Photos directory on my storage drive, but many of them in the root directory didn't have useful file names. I found that the timestamps on many were appropriate, unlike the ones that were downloaded from Google Photos, which hadn't maintained their original timestamps. Thinking this might be a good time to get the files renamed with their date rather than leaving it up to the metadata, I've put together this script:

Code: [Select]
#!/bin/bash
if [ $# -ne 1 ]; then
    echo "$# args given - need one to run the process"
    exit 1
fi

#YEAR=2000
#NEXT=2001
OIFS="$IFS"
IFS=$'\n'

echo "Moving files beginning with $1"
#echo "Also moving files from $YEAR to $NEXT"

echo "" > tmp.txt
#find ./ -maxdepth 1 -type f -name "$1*.jpg" > tmp.txt
find ./ -maxdepth 1 -type f -name "$1*" >> tmp.txt
#find ./ -maxdepth 1 -type f -newermt "$YEAR-01-01 00:00:00" -not -newermt "$NEXT-01-01 00:00:00" >> tmp.txt

for file in `cat tmp.txt`
do
if [ "$file" = "$0" ]; then
echo "Not changing $file"
exit 1
fi
moddate=`stat --format="%y" "$file" | awk '{print $1}'`
# newfilename=`stat --format="%N" "$file"`
newfilename=${file// /_}
newfilename=${newfilename/\.\//_}
newfilename=${newfilename//:/-}
newfilename=${moddate}${newfilename}
echo "$file" " to " "$newfilename"
mv "$file" "$newfilename"

done
IFS="$OIFS"


Sections commented out were helpful during testing but not what I wanted in the end.
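The parameter expansions in the middle of the loop do the real renaming work. Isolated on a sample path (the filename and date here are made up for illustration):

Code: [Select]
```shell
#!/bin/bash
# Demo of the substitutions used in the script, on a sample path.
file="./Tim & Autumn.jpg"
moddate="2000-09-08"                  # what stat would have reported
newfilename=${file// /_}              # spaces -> underscores
newfilename=${newfilename/\.\//_}     # leading ./ -> _
newfilename=${newfilename//:/-}       # colons -> dashes
echo "${moddate}${newfilename}"       # 2000-09-08_Tim_&_Autumn.jpg
```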

Dragon

Regarding my last post about my dragonRename.sh: I tried it on my iMac (macOS Catalina). The stat command doesn't work the same on Mac as it does on my Ubuntu Linux machine, so the process failed at that stat call, throwing some "illegal option" errors.
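For what it's worth, the gap can be bridged with a small wrapper that picks the right invocation. This is a sketch assuming only the GNU and BSD stat variants need to be handled:

Code: [Select]
```shell
#!/bin/bash
# Sketch: return a file's modification date (YYYY-MM-DD) on either
# GNU stat (Linux) or BSD stat (macOS).
modday() {
    if stat --version >/dev/null 2>&1; then
        stat --format='%y' "$1" | awk '{print $1}'   # GNU stat
    else
        stat -f '%Sm' -t '%Y-%m-%d' "$1"             # BSD stat
    fi
}

touch /tmp/statdemo.txt
modday /tmp/statdemo.txt    # prints today's date, e.g. 2021-02-08
```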
« Last Edit: February 08, 2021, 13:18:36 by Dragon »

Dragon

I've added more to my dragonRename script. Now it will take a second argument to determine the max depth that it will go down to pick out matching files. They will be renamed and moved to the current directory, but will include the name of the directory in the new filename.

Code: [Select]
#!/bin/bash
# Find files beginning with the argument given, then rename the files
# to include the timestamp in the filename and remove spaces.
# E.G.
#-rwxrwxr-x 1 dragon dragon     15319 2000-10-25  Nathaniel1.jpg
# to be renamed 2000-10-25_Nathaniel1.jpg
#-rwxrwxr-x 1 dragon dragon     14778 2000-09-08  Tim & Autumn.jpg
# to be renamed 2000-09-08_Tim_&_Autumn.jpg


#for f in $(find ./ -maxdepth 1 -newermt "2000-01-01 00:00:00" -not -newermt "2001-01-01 00:00:00"); do
# echo "$f"
#done
if [ $# -lt 1 ]; then
    echo "$# args given - need at least one to run the process."
    exit 1
elif [ $# -gt 2 ]; then
    echo "$# args given - need one to run the process and optionally one for the directory depth."
    exit 1
elif [ "$1" = "--help" ]; then
    echo "Usage: $0 <PREFIX> [DEPTH=1]"
    echo "Find files beginning with PREFIX, then rename the files "
    echo "to include the timestamp in the filename and remove spaces. "
    exit 1
fi

dirname='./'
if [ -z "$2" ]; then
    dirdepth=1
elif [ "$2" -gt 0 ]; then
    dirdepth=$2
else
    dirdepth=1
fi

#YEAR=2000
#NEXT=2001
OIFS="$IFS"
IFS=$'\n'

echo "Moving files beginning with $1 in $dirname and maxdepth of $dirdepth"
#echo "Also moving files from $YEAR to $NEXT"

echo "" > tmp.txt
#find ./ -maxdepth 1 -type f -name "$1*.jpg" > tmp.txt
#find ./ -maxdepth 1 -type f -newermt "$YEAR-01-01 00:00:00" -not -newermt "$NEXT-01-01 00:00:00" >> tmp.txt
#find $dirname -maxdepth 1 -type f -name "$1*" >> tmp.txt
find "$dirname" -maxdepth "$dirdepth" -type f -name "$1*" >> tmp.txt

echo "Matching files found:"
cat tmp.txt
echo ""
echo "The above files will be renamed here."
read -p "Continue? [y/N]:" prompt

if [ "$prompt" = "y" ]; then
echo "Renaming files..."
else
echo "Exiting without renaming."
exit 1
fi

for file in `cat tmp.txt` 
do
if [ "$file" = "$0" ]; then
echo "Not changing $file"
exit 1
fi
moddate=`stat --format="%y" "$file" | awk '{print $1}'`
# newfilename=`stat --format="%N" "$file"`
newfilename=${file// /_}
newfilename=${newfilename/\.\//_}
newfilename=${newfilename//:/-}
newfilename=${newfilename//\//_}
newfilename=${moddate}${newfilename}
echo "$file" " to " "$newfilename"
mv "$file" "$newfilename"

#     echo "file = $file"
#     diff "$file" "/some/other/path/$file"
#     read line </dev/tty
done
IFS="$OIFS"
find ./ -empty -type d -delete

Example of use:

~/dragonRename.sh M 5
Moving files beginning with M in ./ and maxdepth of 5
Matching files found:
 
./2016-08/MVI_2507 (1).MOV
./Amazon Drive/2016-10/MVI_2921.MOV

The above files will be renamed here.
Continue? [y/N]:y
Renaming files...
./2016-08/MVI_2507 (1).MOV to 2016-08-17_2016-08_MVI_2507_(1).MOV
./Amazon Drive/2016-10/MVI_2921.MOV to 2016-10-22_Amazon_Drive_2016-10_MVI_2921.MOV
« Last Edit: October 16, 2021, 17:04:45 by Dragon »

Dragon

Here is an example that I've run using fdupes to recursively check through the Documents and Downloads directories, then delete duplicates without prompting the user (leaving the 1st found, starting in Documents), and log the results in a file on the Desktop:

Code: [Select]
fdupes -R Documents Downloads -d -N >> ~/Desktop/fdupes.log.txt
And here's another, checking for duplicate files recursively in the user's entire home directory, logging the results, but not deleting anything.

Code: [Select]
fdupes -R ~/ >> ~/Desktop/fdupes.log.txt
WARNING: DO NOT just recursively delete duplicate files from your entire home directory. Using the above command to check for duplicates, and to find out where you might want to run a delete, is the safe way to go; definitely do the search without the delete at least once to find where those extra copies are. The fdupes program will find all kinds of matching files, more than just duplicate photos, so you'll want to check some directories before others. You'll probably also want to avoid Library sub-directories, cache folders, and maybe even Downloads, depending on how things are organized, so that you don't end up deleting important files that might be intentionally duplicated for multiple applications.
« Last Edit: October 09, 2021, 22:25:10 by Dragon »