east of sunset
Apr. 30th, 2024 08:51 pm
Dang, it was pretty out tonight:
I took a few other photos too, they’re on my Mastodon account. Enjoy.
Posted via Solarbird{y|z|yz}, Collected.
Not the article, it’s fine. It’s the photos. Mostly the top one.
It looks fake.
There’s a credited photographer: Ann Hermes. She’s 100% real, a working professional, she’s done a lot of work for a lot of clients, she’s easy to find.
And I don’t know what happened – what tool caused it or what – but the lead photo shouts AI PHOTOGRAPHY at me.
There are a few reasons for it – the central figure’s facial and head hair is a big one, something about the antique touchtone telephone on the left really bugs me (proportions? size? I can’t tell), the lighting just feels odd, and some letters are kinda fucked up – but the details aren’t important. That’s not what bothers me.
What bothers me is that this is a most-likely real photo that’s making me think it’s an AI-generated fake. And it bothers me because…
…see…
if people are going to start using tools on real photos that make them look a little fake, particularly if the ways they look a little fake are the same way that AI-generated actual fakes look fake, then that’s incredibly bad.
So far we’ve been lucky enough in ’24 not to have to deal with really good fakes. We’ve mostly had trash fakes. But if real photographers start using tools that make their real photos look fake in the same way AI renderings do, then the fact that AI renderings aren’t really good fakes stops mattering very much – if at all. We lose that little bit of edge we had against AI-driven disinformation.
I mean, maybe that’s not what happened here. Maybe his hair really is that kinda weirdly defined and she did some HDR tricks to bump up the definition and contrast, not even using AI tools. Maybe she corrected some lens distortion and that’s why the phone looks funny. Maybe she boosted the black level a little to de-emphasise some dark areas – it’s tempting, I’ve done it, but you have to be careful with it or it looks weird. Maybe the light just was a little odd, and/or she set up some reflectors to make it that way because the scene was too high contrast otherwise. Or maybe she did it afterwards using ordinary levelling tools.
But however it happened – and whoever did it, be it the photographer, the layout artist, the web designer – it still came out setting off my AI alarms.
And that’s incredibly bad.
Fuck.
Posted via Solarbird{y|z|yz}, Collected.
YEP IT COLD
But it’s also dry and exceptionally clear. The “overnight” low actually hit at like 10:30AM, which is kinda nuts, at roughly -9°C – not as bad as they got on the other side of the border, but still Cold Enough For Me, Thanks.
Mostly the last couple of days tho’ I’ve been at the 3D printer, because Anna got me a device called a BL-Touch, which is a sensor that lets the printer discover the print bed level – and any irregularities in it – by itself, using a little probe that goes around the print bed in a grid making measurements. (BL-Touch = Bed Level by Touch, you see. I have to explain that because BL has rather different meanings in some places, particularly Japan I am just saying, and it’s always a little yikes to me as a result.)
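If you’re curious what gets done with all those grid measurements, here’s a minimal sketch of the idea in Python: the probed heights become a mesh, and the printer bilinearly interpolates a Z correction for any point in between. This is just an illustration of the concept with made-up numbers – the real firmware (Marlin’s G29 code, say) is rather more involved.

```python
# Sketch of mesh bed leveling: probe a grid of points, then
# bilinearly interpolate a Z correction anywhere on the bed.
# Numbers are invented for illustration; real firmware differs.

mesh = [                      # probed Z offsets (mm) on a 3x3 grid
    [0.00, -0.02,  0.01],
    [0.03,  0.00, -0.01],
    [0.05,  0.02,  0.00],
]
BED_W, BED_H = 220.0, 220.0   # bed size in mm (assumed)
NX, NY = len(mesh[0]), len(mesh)

def z_correction(x: float, y: float) -> float:
    """Bilinear interpolation of the probed mesh at bed position (x, y)."""
    # Map bed coordinates onto fractional grid indices
    gx = min(x / BED_W * (NX - 1), NX - 1 - 1e-9)
    gy = min(y / BED_H * (NY - 1), NY - 1 - 1e-9)
    ix, iy = int(gx), int(gy)
    fx, fy = gx - ix, gy - iy
    # Blend the four surrounding probe points
    top = mesh[iy][ix] * (1 - fx) + mesh[iy][ix + 1] * fx
    bot = mesh[iy + 1][ix] * (1 - fx) + mesh[iy + 1][ix + 1] * fx
    return top * (1 - fy) + bot * fy

print(z_correction(110.0, 110.0))  # correction at bed centre
```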
I also replaced three failed bearing wheels that absolutely should not have failed but did, and in the unlikely event you’re one of the nontrivial number of people who downloaded my nozzle size dial, you’ll want to know that I had to make a new one that’s BL-Touch compatible. I found this out by not knowing my old design wasn’t BL-Touch compatible, turning the printer on, having it start a level pass, and then slamming the hot end continuously against the far left side of the gantry in a way that sounded like a garbage disposal eating its own bearings until I could turn the stupid thing off.
So that was exciting.
The printer is fine, thankfully. The BL-Touch is really nice and I’ve managed to improve the flatness of my printer bed using aluminium foil as a spacer, thanks to being able to see the irregularities. I was able to get it to 0.05mm, which I think is pretty good.
Anyway, I’ve got enough material for a few Fascism Watches, I’ll drop more soon. Also this is the first test of my edited “can we make Featured Images look less stupid on Dreamwidth?” functionality. Let’s see how it looks!
eta: TALL HOLY SHIT TALL let’s see if I can fix that lol
eta2: YAY I can! Okay, I think this’ll do. ^_^
Posted via Solarbird{y|z|yz}, Collected.
Anna and I went to check out the new wetlands park down by Sammamish (a bunch of photos here) and it was really nice despite not being really grown-in yet. That’ll take a little while, since they’re going full-native-plantings on it, but it’s good, I like it. Particularly the elevated pathways that let you get out onto the peninsula between Swamp Creek and the Sammamish River.
But we stayed kind of longer than I thought we would, and since I wanted to bike to Bothell (the next town over) to buy some items I can’t get at either closer grocery and get home before it got dark, I turned on the electric assist and leaned in a bit.
Normally I don’t use the assist on this trail at all, right? But I was realising that I was pretty high up the gears and something felt a little different, and looked down to see that I was going over 20mph, which means all the assist had turned itself off, as it’s supposed to do at or around 20mph on a Class 1 ebike.
And now I know what that feels like. I think it ramps down intelligently starting at like 19.5mph. It felt very smooth. (And I’m using miles because that’s what the rules here are written in.)
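For what it’s worth, here’s roughly what I imagine that ramp-down looks like – a smooth taper of the assist between about 19.5 and 20mph rather than a hard cutoff. Pure speculation on my part; the actual controller logic is proprietary, and the threshold and curve here are made up to show the general shape:

```python
# Speculative sketch of a Class 1 assist ramp-down: full assist
# below ~19.5mph, smoothly fading to zero by 20mph.

TAPER_START = 19.5  # mph, where assist begins to fade (my guess)
CUTOFF = 20.0       # mph, Class 1 assist limit

def assist_factor(speed_mph: float) -> float:
    """Fraction of full motor assist applied at a given speed."""
    if speed_mph <= TAPER_START:
        return 1.0
    if speed_mph >= CUTOFF:
        return 0.0
    # Smoothstep between the two, so the hand-off has no sudden jerk
    t = (speed_mph - TAPER_START) / (CUTOFF - TAPER_START)
    return 1.0 - (3 * t**2 - 2 * t**3)

for v in (19.0, 19.6, 19.8, 20.0):
    print(f"{v:.1f} mph -> {assist_factor(v):.0%} assist")
```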
I think that’s maybe the fastest I’ve gone on a mild incline. I’ve certainly gone faster downhill, but this was flat to very slightly uphill, so I’m pretty pleased with myself over that. And I was certainly surprised.
Anyway, I got to Bothell and then got back home well before dark, even if some lights were turning on, and the frozen coldpack I’d thrown into the insulated bag on the way out showed no signs of melt at all, which is really as I’d expected but it’s still nice to have it play out.
Then I got home and made another iteration of my newest object design, which is printing out now. It’s another accessory for the vertical project mounting system, but I’ll post about that later.
Posted via Solarbird{y|z|yz}, Collected.
Yeah, bike definitely needed that second round of cleaning, and I do think I like this Effetto wax more than the Rock n Roll.
I mean, this was also after realigning the rear derailleur, which absolutely didn’t hurt anything, and also I absolutely did a better job of cleaning the cartridge the second time than I did the first, even if the first time got out 90% of the dirt and crud. But still.
I guess the best way to put it is that I kept glancing at the gear display thinking I was one gear lower than I actually was. The first time made everything much smoother; after the second cleanup, it basically bought me an entire gear.
Also, the shifting was so smooth and quiet I didn’t even always hear it. I had to check a few times, that’s how smooth and quiet it was.
I did have a little noise on one gear start to show up early in the ride but a little fine tuning at the halfway point sorted it out completely and everything else was fine so that was nice.
sooooooo smoooooooth
Here, have a fall photo from the Sammamish River Trail:
Posted via Solarbird{y|z|yz}, Collected.
It’s been a while since I posted a flower picture, so here y’go. This was taken on a hillside above the Oregon coast.
Mirrored from Crime and the Blog of Evil. Come check out our music at:
Bandcamp (full album streaming) | Videos | iTunes | Amazon | CD Baby
I’ve started a special-purpose Tumblr blog dedicated to an old newspaper I found being used as packing material at an estate sale. It’s called Seattle—July 20, 1971 (or “Let’s Read the Newspaper!”) and it’s photos of pages, ads, ephemera, and mostly-local-news articles from salvageable pages of that newspaper.
I won’t be crossposting that here, so if you want to follow it, go follow it separately, either on Tumblr or via its own RSS feed.
Mirrored from Crime and the Blog of Evil. Come check out our music at:
Bandcamp (full album streaming) | Videos | iTunes | Amazon | CD Baby
I played more with the panorama function on iOS 7 last night. It appears to assume that you’ll stand in place and turn, which is how most people do it, but I wanted to see how it would work if you didn’t do that, and instead scooched along in a straight line, to the side.
It really doesn’t expect you to be doing that. Check these out and click for larger:
The above looks pretty much right. I’m not practiced with it and the light could’ve been brighter, but you get the idea. Items look the correct size and shape, really, and the seaming is handled very well.
Now check it when you don’t do what they expect and slide from left to right:
Look how wibbly and bent things get! Particularly the compost bin – that’s the silver cylinder on the countertop. Is that cool or what? I suspect there’s some insight into their algorithmic assumptions here.
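My guess at the insight, for what it’s worth: stitchers like this assume you’re rotating in place, so every pixel shifts by the same amount regardless of how far away the object is. Slide sideways instead and you introduce parallax – the pixel shift goes like focal length times baseline over depth, so near objects smear far more than distant ones. A quick back-of-envelope with entirely made-up numbers:

```python
# Back-of-envelope parallax: translate sideways by B metres and a
# point at depth Z shifts across the sensor by roughly f * B / Z.
# With pure rotation there is no depth term, which is the stitcher's
# assumption. All numbers below are invented for illustration.

f_px = 3000.0   # focal length in pixels (assumed)
B = 0.5         # sideways step between frames, metres (assumed)

for name, Z in [("compost bin, ~1m away", 1.0), ("far wall, ~5m away", 5.0)]:
    shift = f_px * B / Z  # pixel displacement between frames
    print(f"{name}: ~{shift:.0f}px of parallax shift")
```

The nearby compost bin eats five times the parallax the far wall does, which would be why it comes out the most wibbly.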
So, yeah. Not built for this purpose.
Oh, the stand-in-place version is cropped, because it came out to a higher total vertical resolution for some reason – 2468 pixels high. The slide version used, I think, the whole width, or close to it, and came out to 8627 pixels horizontal. That’s a pretty high-resolution pan. It is kind of noisy, tho’; I’d like to see how it does in good light. I suspect it’s optimised for outdoors.
Mirrored from Crime and the Blog of Evil. Come check out our music at:
Bandcamp (full album streaming) | Videos | iTunes | Amazon | CD Baby
The new iOS 7 panorama UI is perhaps the best I’ve ever seen, and it works great. You start, you pan slowly right, you tell it when you’re done, and there’s a UI to show you if you’re moving up and down or if you’re going too fast.
But since it works by knitting together new slices of images as you turn/move, you can screw with it by walking around. :D
Those were taken with Paul’s iPhone 4. My iPad Mini doesn’t support this feature. I may need a new phone now. Goddammit.
Mirrored from Crime and the Blog of Evil. Come check out our music at:
Bandcamp (full album streaming) | Videos | iTunes | Amazon | CD Baby
For some reason, I feel like talking about photography.
Here’s a shot I took from Butchart Gardens, outside Victoria, weekend before last:
I was going for that kind of ’40s-or-’50s holiday feel – an older boat, a dock that’s actually pretty new but looks older because of the sepiatone, and all that. I’m pretty happy with it. It has a 1930s feeling that I get from looking at land photos from the era.
But all of that was post-photo, because this was originally a shot with different intent – an intent that didn’t work. At all. Here’s the original:
(Technically speaking, that’s the next shot, but it’s pretty much identical.) I was trying a couple of experiments that failed in the same way, but I didn’t delete the shots from my camera, and ended up with the sepia faux historical.
In terms of mechanics, getting to the above from the below was all in iPhoto, but it works the same in Photoshop. iPhoto has a lovely biased centre-of-brightness tool they call “Shadows,” and another one biased differently called “Highlights.” The first makes shadows brighter, the second brings down highlights, and in both cases, they’ll reveal lots of lost detail if you crank them way the hell up.
The problem with this approach is that no matter what, you’re missing a lot of colour data. You just don’t have it – at least, not in usable resolution. The resulting images often look washed out and/or really grainy. This original, treated thusly, looks really washed out:
…which is where monochrome comes in. I went with sepia/amber here to evoke a mood, but standard black-and-white would’ve worked about as well. If you merge the colour data down to a monochrome palette, you get back a similar amount of intensity data to what you’d’ve had if you’d shot the image in black-and-white to start. It looks natural, within the artifice of photography.
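If you wanted to replicate the general move outside iPhoto, here’s a rough Pillow sketch: a gamma curve standing in for the “Shadows” slider, then a merge down to grey with an amber tint on top. The curve shape and tint values are my own invention, not iPhoto’s actual math – it’s the technique, not the tool:

```python
# Approximate "crank the shadows, then go sepia" with Pillow.
# A sketch of the general technique, not iPhoto's actual processing.
from PIL import Image

def lift_shadows(img: Image.Image, gamma: float = 0.6) -> Image.Image:
    # gamma < 1 brightens dark tones far more than bright ones,
    # roughly what a "Shadows" slider does when cranked way up
    lut = [round(255 * (i / 255) ** gamma) for i in range(256)]
    return img.point(lut * len(img.getbands()))

def to_sepia(img: Image.Image) -> Image.Image:
    # Merge the colour data down to intensity, then tint it amber:
    # black stays near-black, white goes to a warm cream
    grey = img.convert("L")
    channels = [grey.point(lambda v, r=r: min(255, round(v * r)))
                for r in (1.07, 0.74, 0.43)]  # tint weights (invented)
    return Image.merge("RGB", channels)

img = Image.open("original.jpg").convert("RGB")
to_sepia(lift_shadows(img)).save("sepia.jpg")
```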
I’ve pulled out a fair number of concert shots this way, and night crowd shots. You get this old-school newspaper/disco kind of look. I’ll even turn up the graininess on purpose, to drive that home. And with that, a shot that looked lost can be made vibrant and interesting again.
Cf. this crowd shot, at Strowler Nights, a few years ago:
That was basically a black rectangle with highlights, on my camera. But play with the levels, edit out a stray arm in the lower left, take out the colour, and: result!
If I’d left it in colour, it would’ve been – at best – a washed-out mess with hints of colour. But take it to monochrome, kick up the grain so it looks intentional, and you end up with a textured crowd portrait.
Which I guess really means I didn’t want to talk about photography, I wanted to talk about art, and intent. To wit: a lot of things you think of as flaws or problems can become assets, if you just turn them up to the point where they look intentional, then fine-tune them a bit. Not everything, gods know. But a surprising number.
If you’ve done something like this, post links or descriptions, eh? Share your mistakes-turned-successes. It might be fun. ^_^
Mirrored from Crime and the Blog of Evil. Come listen to our music!