GenderNeutralBro
GenderNeutralBro@lemmy.sdf.org

Some vegan leather is made of mushrooms or vegetables. https://en.wikipedia.org/wiki/Plant-based_leather#Mushroom_leather

Personally I find it odd to call polyurethane “vegan leather”. Seems like a marketing ploy to make it sound new. I’ve always known it as “pleather” or “faux leather”. https://en.wikipedia.org/wiki/Artificial_leather

Probably ~15TB through file-level syncing tools (rsync or similar; I forget exactly what I used), just copying up my internal RAID array to an external HDD. I’ve done this a few times, either for backup purposes or to prepare to reformat my array. I originally used ZFS on the array, but converted it to something with built-in kernel support a while back because it got troublesome when switching distros. Might switch it to bcachefs at some point.
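
For the curious, the sync itself was nothing fancy; from memory it was something along these lines (paths are placeholders, and the exact flags may have differed):

# copy the array to the external drive, preserving permissions, hard links, ACLs, and xattrs
rsync -aHAX --info=progress2 /mnt/array/ /mnt/backup/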

With dd specifically, maybe 1TB? I’ve used it to temporarily back up my boot drive on occasion, on the assumption that restoring my entire system that way would be simpler in case whatever I was planning blew up in my face. Fortunately never needed to restore it that way.
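
In case anyone wants the shape of that, it was roughly this (device and output paths are made up for the example); restoring would just be the same command with if= and of= swapped:

# image the whole boot drive to a file on the external disk
dd if=/dev/nvme0n1 of=/mnt/backup/boot.img bs=4M status=progress conv=fsync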

Hopefully they have better defenses against legal action from Nvidia than ZLUDA did.

In the past, re-implementing APIs has been deemed fair use in court (for example, Google’s reimplementation of the Java API in Oracle v. Google a few years back). I’m not entirely sure why ZLUDA was taken down; maybe just to avoid the trouble of a legal battle, even if they could win it. I’m not a lawyer, so I can only guess.

Validity aside, I expect Nvidia will try to throw their weight around.

Exactly. My nightstand has a plain ol’ USB 2.0 slow charger, which is plenty to get it fully charged overnight. On light days, that’s all I need.

I have a USB-PD charger on my desk and in my travel bag, which I’ll use to get me through the day as needed, but my phone only takes 20W max, IIRC. There have certainly been times while traveling when I wished it could charge much, much faster, because I don’t have access to power for long stretches of time.

I used to have a OnePlus phone, which used SuperVOOC instead of just USB-PD. Much faster and I never had heat issues. I’d love to see SuperVOOC adopted more outside of China. I think OnePlus is the only brand with SuperVOOC sold in my country.

I see that lemmy.sdf.org gets its cert from Let’s Encrypt, and it renews in 60-day increments. Is it possible to have it auto-renew a week in advance of expiration?
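
If the instance uses certbot (just a guess on my part; I don’t know what it actually runs), I believe the renewal buffer is a per-certificate setting in its renewal config, something like:

# in /etc/letsencrypt/renewal/<domain>.conf (path is my assumption)
# the default is 30 days; set it to whatever buffer you want, e.g. a week:
renew_before_expiry = 7 days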

Off the top of my head I’m not sure why that would be. To troubleshoot, it might help to print the output every step of the way so you can see if there are any oddities. Something like this perhaps, in place of the FIRST_DATE= line.

echo "$file_path"
EXIF_OUT=$(exiftool -m -d '%Y%m%d' -T -DateTimeOriginal -CreateDate -FileModifyDate -DateAcquired -ModifyDate "$file_path")
echo "$EXIF_OUT"
EXIF_FILTERED=$(echo "$EXIF_OUT" | tr -d '-')   # strip the "-" placeholders exiftool prints for missing tags
echo "$EXIF_FILTERED"
FIRST_DATE=$(echo "$EXIF_FILTERED" | awk '{print $1}')
echo "$FIRST_DATE"

Glad it’s working! Couple more quick ideas:

Since you’re looping through the results of find, $file_path will already be a single path name, so you don’t need the inner for images in $file_path loop anymore.

I think you’re checking each field of the results in its own if statement, e.g. if [[ $(echo $ALL_DATES | awk '{print $1}')... then if [[ $(echo $ALL_DATES | awk '{print $2}')... etc. While I don’t think this is hurting performance significantly, it would make your code easier to read and maintain if you first found the correct date, and then did only one comparison operation on it.

For example, exiftool -m -d '%Y%m%d' -T -DateTimeOriginal -CreateDate -FileModifyDate -DateAcquired -ModifyDate "$file_path" returns five columns, which contain either a date or “-”, and it looks like you’re using the first column that contains a valid date. You can try something like this to grab the first date more easily, then just use that from then on:

FIRST_DATE=$(exiftool -m -d '%Y%m%d' -T -DateTimeOriginal -CreateDate -FileModifyDate -DateAcquired -ModifyDate "$file_path" | tr -d '-' | awk '{print $1}')

tr -d '-' will delete all occurrences of ‘-’. That means the result will only contain whitespace and valid dates, so awk '{print $1}' will print the first valid date. Then you can simply have one if statement:

if [[ "$FIRST_DATE" != '' ]] && [[ "$FIRST_DATE" -gt $start_range ]] && [[ "$FIRST_DATE" -lt $end_range ]]; then

Hope this helps!

I have not tested this, but I have a couple ideas off the top of my head.

#1 - Retrieve all fields with a single exiftool command. e.g. ALL_DATES=$(exiftool -m -d '%Y%m%d' -T -DateTimeOriginal -CreateDate -FileModifyDate -DateAcquired -ModifyDate "$filename")

Then retrieve individual fields from $ALL_DATES with something like awk. e.g. echo $ALL_DATES | awk '{print $1}' will return the first field (DateTimeOriginal), and changing that to ‘{print $2}’ will return the second field (CreateDate).

#2 - Perhaps process multiple files with a single exiftool call. e.g. exiftool -m -d '%Y%m%d' -T -DateTimeOriginal -CreateDate -FileModifyDate -DateAcquired -ModifyDate ~/Pictures/*. You might compare whether running just this exiftool query once vs running it in a loop takes a significantly different amount of time. If not, it’s probably simpler to use one call per file.
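
For the timing comparison in #2, something as simple as this would give a rough idea (the path is just an example):

time exiftool -m -d '%Y%m%d' -T -DateTimeOriginal -CreateDate -FileModifyDate -DateAcquired -ModifyDate ~/Pictures/* > /dev/null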

Edit: I doubt either find or globbing will use a significant amount of time; however, the issues you have with find and spaces in file names can be worked around by using find’s -print0 option. This prints out file paths separated by NUL bytes (i.e. ASCII value 0). You can then loop through them without needing to guess whether whitespace is part of the path or a delimiter. A common way of dealing with this is to pipe the output of find into xargs like so: find ~/Pictures -type f -print0 | xargs -0 -L 1 echo 'File path: '. That will execute echo 'File path: ' <file> for every file in your Pictures folder. If that feels a little convoluted, you can also use a while loop like so:

find ~/Pictures -type f -print0 | while IFS= read -r -d '' file_path; do
    echo "Processing: $file_path"
done

Note that when you pass a blank string with read -d '', it reads up to a NUL character, as documented here: https://www.gnu.org/software/bash/manual/bash.html#index-read . I’m not 100% sure whether this holds in older versions of Bash or in other similar shells.

T-Mobile has gone nuts raising prices since the Sprint merger, even on plans whose prices they very loudly advertised as locked in for life. See: https://arstechnica.com/tech-policy/2024/06/t-mobile-users-thought-they-had-a-lifetime-price-lock-guess-what-happened-next/

Do I need a 20TB boot drive? No. Do I want it enough to pay $250? Yes, absolutely. I’m running 1TB now and I need to manage my space far more often than I’d like, despite the fact that I keep my multimedia on external mass storage. Also, sometimes the performance of that external HD really is a hindrance. I’d love to just have (almost) everything on my primary volume and never worry about it.

It’s kind of weird how I have less internal storage today than I did 15 years ago. I mean, it’s like 50 times faster, but still.

I’m not super-skeptical about the pricing. This stuff can’t stay expensive forever, and 2027 is still a ways off.
