Cannot use database after moving from local machine to NAS #1507
Comments
Interesting, you're the second user in the last few days to report this. I cannot think of any reason this shouldn't work. The entire reason the export database is kept with the export is so the entire export tree can be easily moved while preserving export state, and I've tested this. I will do some testing, but I don't have a NAS to test with, so I'll see what I can replicate.
@odedia @cukal I have replicated this issue. I don't have a NAS but created an SMB share on another Mac on my network and followed these steps:
OSXPhotos crashed with the same error as reported. I then used the same rsync command to copy the export directory to a new path on the local Mac, and the export worked as expected:
Then I created a new directory on the network volume and ran OSXPhotos against this:
and this worked as expected: OSXPhotos created a new export database and did the export. Re-running the export also worked as expected. Finally, I used rsync to copy the export from step 6 (the newly created export on the remote volume) to the local Mac and re-ran the export:
And this worked fine! So the issue appears to happen only when copying a SQLite database from the local machine to the remote machine. Given that this only happens when opening the copied SQLite database on the network volume, I suspect this is some sort of weird SQLite thing, but I will instrument the OSXPhotos code and see if I can get any clues.
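As a first diagnostic sketch (the helper name and approach here are my own illustration, not osxphotos code), one could check whether the copy dragged along stale SQLite sidecar files and whether the copied database still passes an integrity check:

```python
import os
import sqlite3

def diagnose_sqlite_copy(db_path):
    """List leftover SQLite sidecar files next to db_path and run an
    integrity check on the database itself.

    A -wal, -shm, or -journal file copied alongside the database (rsync
    copies them like any other file) can confuse SQLite when the database
    is reopened on a different filesystem, so this is a useful first look.
    """
    sidecars = [db_path + suffix
                for suffix in ("-wal", "-shm", "-journal")
                if os.path.exists(db_path + suffix)]
    conn = sqlite3.connect(db_path)
    try:
        result = conn.execute("PRAGMA integrity_check").fetchone()[0]
    finally:
        conn.close()
    return sidecars, result
```

A cleanly closed database should report no sidecars and an integrity result of `ok`; anything else would point at the copy rather than the network filesystem.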
More data: I could not recreate this with a simple test:
import sqlite3
import sys

# Connect to SQLite database
try:
    db_name = sys.argv[1]
except IndexError:
    sys.exit('No database name provided')

conn = sqlite3.connect(db_name)

# Create a cursor object
cur = conn.cursor()

# Execute a SQL query to fetch all rows from mytable
cur.execute('SELECT * FROM mytable')

# Fetch all rows as a list of tuples
rows = cur.fetchall()

# Print all rows
for row in rows:
    print(row)

# Close connection
conn.close()
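One possible gap in the simple test above: a bare SELECT never takes a write lock, and network filesystems most often break SQLite's file-locking path rather than plain reads. A sketch that forces a write lock (my own probe, not osxphotos code) might get closer to the failure:

```python
import sqlite3
import sys

def write_lock_test(db_name):
    """Open db_name and perform a write inside an explicit transaction.

    BEGIN IMMEDIATE acquires SQLite's write lock up front, exercising the
    file-locking path that network filesystems often implement poorly;
    a read-only SELECT never touches it.
    """
    conn = sqlite3.connect(db_name, isolation_level=None)
    try:
        conn.execute("BEGIN IMMEDIATE")
        conn.execute("CREATE TABLE IF NOT EXISTS _lock_test (id INTEGER)")
        conn.execute("INSERT INTO _lock_test VALUES (1)")
        conn.execute("DELETE FROM _lock_test")
        conn.execute("COMMIT")
    finally:
        conn.close()

if __name__ == "__main__" and len(sys.argv) > 1:
    write_lock_test(sys.argv[1])
    print("write lock test passed")
```

Running this against the database on the SMB share, versus a local copy, might separate a locking failure from a plain read failure.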
Interesting. I wonder if SQLite uses some file sockets that only work when it's not on a network drive.
More debug output leads to more questions.
Except that it does work when SQLite creates the database on the network drive. Very strange!
The SQLite docs recommend against using SQLite over a network connection, which is why OSXPhotos implements an alternative. In the meantime, I recommend you use that option.
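For context, the general load-into-RAM technique (the `--ramdb` option mentioned later in this thread) can be sketched with Python's `sqlite3` backup API. This is a minimal illustration of the idea only, not osxphotos's actual implementation:

```python
import sqlite3

def run_with_ram_db(db_path, work):
    """Copy the on-disk database into RAM, run work(conn) against the
    in-memory copy, then write the result back to disk.

    Only the initial read and the final write-back touch the file, so no
    SQLite locks are held on the (possibly network-mounted) volume while
    the real work runs.  A sketch of the general technique only, not
    osxphotos's actual code.
    """
    mem = sqlite3.connect(":memory:")
    disk = sqlite3.connect(db_path)
    disk.backup(mem)      # disk -> RAM
    disk.close()
    result = work(mem)    # all queries and updates run in memory
    disk = sqlite3.connect(db_path)
    mem.backup(disk)      # RAM -> disk
    disk.close()
    mem.close()
    return result
```

The trade-off is that a crash mid-run loses the in-memory changes, since nothing is written back until the end.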
Thanks for looking into it; glad I'm not alone in being stumped. I'll use that option for now.
The thing is, in my original issue I did provide that option.
Yes, I think I need to just update the instructions to say not to store the export DB on a network volume and to use the alternative instead.
See also #1419. Will need to test whether a DB copied to the NAS by the user and then copied back to the Mac by OSXPhotos can be opened properly.
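That round-trip test could be sketched as follows. The helper and the staging directory are my own stand-ins for the NAS share; plain file copies stand in for the user's rsync and the copy-back:

```python
import os
import shutil
import sqlite3

def round_trip_check(db_path, staging_dir):
    """Copy db_path into staging_dir and back, then verify SQLite can
    open and read the returned copy.

    staging_dir stands in for the NAS share; plain file copies stand in
    for the user's rsync and the tool's copy-back.
    """
    staged = os.path.join(staging_dir, os.path.basename(db_path))
    shutil.copy2(db_path, staged)        # "user copies the DB to the NAS"
    returned = db_path + ".returned"
    shutil.copy2(staged, returned)       # "it gets copied back to the Mac"
    conn = sqlite3.connect(returned)
    try:
        ok = conn.execute("PRAGMA integrity_check").fetchone()[0]
    finally:
        conn.close()
    return ok == "ok"
```

Run with `staging_dir` pointing at the mounted SMB share, this would show whether the copy-back step alone is enough to produce an unreadable database.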
I used my local machine to do a first run of my library, and then migrated the entire folder to my NAS (TrueNAS, ZFS, shared via SMB).
I then tried to run the command again using --update --ramdb, but it seems like osxphotos couldn't access the database. This is the error I get:
This is the contents of osxphotos_crash.log:
I then decided to copy the database file to my local machine, and that seems to have worked:
I would still prefer to keep the database close to the photos, so using the ramdisk option would be my preference. If there's anything I can do to help correct this, I'd be happy to try it out.
Thanks.