• Oct. 2, 2023, 3:58 p.m.

    This year we have had some vast "improvements" in photo software, mainly in using AI to do things much more easily (or so it is claimed). However, what I am finding is that the underlying hardware I now need to run things like Capture One and ON1 has to be so much more powerful. I have what I thought (with my previous s/w) was a perfectly acceptable system: a decent i7-8700 processor, 16GB of memory and an Nvidia 1060 graphics card.

    But no - it takes many seconds to process changes and I am having to look at spending thousands on upgrading if I want to get back to anywhere near an 'instant' result of my changes. I don't want to do this - my hardware is fine for other things (email, spreadsheets, word processing, web design) but if I want to experiment with photo adjustments and still have time for a life, I have to do it.

    Am I alone in this or do you all have much better h/w than me?

    Alan

  • Removed user
    Oct. 2, 2023, 4:14 p.m.

    An interesting post indeed.

    As an amateur who does not print, I am sticking stubbornly to my 20-year-old 3.4MP Sigma SD9 and opening its raw-only files with an old version of Sigma's converter. As for AI and such, I use the free GIMP and RawTherapee, updating them occasionally. Both are still fast enough on my i5 Dell desktop machine.

    Totally uninterested in Topaz et al.

    As to my little-used 20MP Lumix G9, I will not be upgrading to the G9 II - a waste of money for me.

    My hardware is old by today's standards and will continue until it dies ...

  • Members 1553 posts
    Oct. 2, 2023, 4:15 p.m.

    We all live the same story: always more (to wash whiter than white).
    I'm feeling overwhelmed by the new gadgets, and have been since 1979, almost 45 years of technological development.
    And that's just the beginning. We always end up finding a justification for spending tens of thousands of dollars.
    Good luck 😈

  • Oct. 2, 2023, 4:26 p.m.

    You are not alone.

    My old'n'good i5 + onboard video was pretty usable for my old Sigma, but for the Fuji X100V it definitely feels underpowered (slooooow). Maybe the problem is Silkypix (it is not the fastest software, after all), but five times more megapixels will certainly be slower to process in any software (Silkypix for Sigma was quite usable).

  • Members 196 posts
    Oct. 2, 2023, 4:58 p.m.
    I was also using a soap powder analogy 😀 in a thread elsewhere. It is getting a bit silly with some reviewers: product A had a perfect, flawless whatever feature, but Mk II of the product is even more perfect, even more flawless. No doubt Mk III, Mk IV and Mk V will be better still 😀 I think that unless you shoot in a very specific, narrow niche, we are way beyond the "good enough" level.
    
  • Members 196 posts
    Oct. 2, 2023, 5:02 p.m.

    Calling Silkypix "not the fastest software" is remarkably generous 😀 Jesting aside, more and more of modern software's AI shenanigans are very much GPU-dependent.

  • Foundation 1502 posts
    Oct. 2, 2023, 6:08 p.m.

    In January 2021 I upgraded my desktop from one I built in 2010, solely because of the needs of photo-processing software. The old computer had no GPU, with graphics on the motherboard. I went to the dark side and bought an AMD Ryzen 9 3900X CPU, an AMD Radeon Pro W5500 GPU, 32 GB RAM and M.2 SSD drives. I don't know how this compares with Alan's setup, but it is still running as fast as I need. I agree with Jim Stirling that today's photo-processing software seems to be as much as I need (I use PhotoLab, Topaz Sharpen AI and Photoshop CS5, and don't have to wait on them to complete their tasks).

    When Alan says that his machine is quite adequate for "email, spreadsheets, word processing, web design", I would point out that, apart from enormous spreadsheets, these tasks are not particularly demanding -- unless one insists on using programs with unnecessary bells and whistles!

    David

  • Members 1578 posts
    Oct. 2, 2023, 6:49 p.m.

    Mac user here. Last winter, when Photoshop began to lag and give me messages that the latest versions, which I pay $10 a month to have, would no longer work, I laid my 12-year-old Mac mini to rest and got a new Mac Studio with twice the RAM. It is remarkably fast in whatever I've set it to do. I've yet to upgrade the 27" monitor, as the old one, at age 15, still soldiers on, is quite lovely, and has plugs for all the old outdated peripherals I don't yet want to replace. Contrary to warnings I had read, the old monitor doesn't slow anything down. Everything works, and works fast.

    My software is pretty varied: Lightroom, Photoshop, Topaz, DXO, Luminar, OnOne, Corel Painter, and the usual productivity apps. The AI stuff in Photoshop takes only a few seconds to generate, but I haven't asked it to do anything serious. I have not become enamored of AI (yet), but it isn't lag that keeps me from it; it's rabbits with three ears and people with seven fingers on a hand. I do use it for background fill sometimes.

  • Members 244 posts
    Oct. 2, 2023, 6:58 p.m.

    You're not alone, but what you describe has happened to me in a repeating cycle for 40 years. This is just another cycle, IMO.

  • Members 861 posts
    Oct. 2, 2023, 11:15 p.m.

    I've avoided this problem completely by working in video production first, meaning I can't skimp on processing power. I'm always fearful of giving out GFX raw files because most people's systems just aren't built for that.

    Not the answer you might want to hear, but some processes with the GFX cause things to slow down a bit. My solution: make your image smaller. Not sure what you're working with or where it's going, but the PS AI tools are low quality currently. You're probably fine slicing your file sizes in half first to help speed things up, unless you're trying to do big prints.

  • Members 317 posts
    Oct. 3, 2023, 1:42 p.m.

    Yep, just another cycle. However, I find my MBP 14 (about a year old) with the M1 Pro and 32 GB is pretty future-proof (at least for a few years). Apple made the choice to integrate multiple AI-optimized cores in the new chips, using shared memory between the GPU and processor. AI applications are dominated by extensive linear algebra calculations (tensor manipulation); TensorFlow is a tool optimized for such processing, and optimized processors can be a big help. As users demand more of their processing s/w, vendors will respond, and the more we ask, the more stress we put on the h/w. On the other hand, I have noticed that in C1 the masking tools have evolved a lot through the use of AI techniques over the past year. They are promising a major upgrade in the next version. No matter what computer one has, at some point the s/w will expand to exceed its capability. That is a corollary of "Moore's Law." 😉
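    To make the linear-algebra point concrete, here is a toy NumPy sketch (all shapes and sizes are made up for illustration; a real masking network has many layers and far larger weights). Applying one dense layer to every pixel's feature vector is just one big matrix multiply, which is exactly the kind of work GPUs and AI cores accelerate:

```python
import numpy as np

# Scaled-down toy: treat each pixel as a small feature vector and apply
# one dense layer of a hypothetical masking network. A real 24MP image
# would have roughly 100x more rows than this.
pixels = 600 * 400
features_in, features_out = 8, 4

x = np.random.rand(pixels, features_in).astype(np.float32)        # per-pixel features
w = np.random.rand(features_in, features_out).astype(np.float32)  # learned weights

y = x @ w   # one network layer == one large matrix multiply
print(y.shape)   # (240000, 4)
```

    Scale the row count up to real sensor resolutions and stack dozens of layers, and it becomes clear why this workload swamps a CPU with integrated graphics.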

  • Oct. 3, 2023, 3:30 p.m.

    I'm sticking to Windows. I've got too much invested in that side of things to change.

  • Members 244 posts
    Oct. 3, 2023, 8:40 p.m.

    Totally understandable. I switched from a PC environment to a Mac environment about 15 years ago. Having used both environments for decades, I do think that, for ME, Macs last a bit longer in the cycle you are describing.

    However, if I was still in the Windows environment and had lots and lots of devices all tied together in that environment, I doubt that I would switch to a Mac either.

  • Members 746 posts
    Oct. 4, 2023, 9:27 a.m.

    Once it gets to the point of requiring a supercomputer to process a few shots, it's not photography any more to me. I think you've then lost sight of the picture-taking or image-capture aspect of it all, and pretty much reduced it to electronic manipulation of data. Which is not photography. Far from it. Go back to the basics. Keep it simple; actually see and look at what you're taking pictures of. Stop micro-analysing the data. See the big picture. You'll be far happier that way.

  • Oct. 4, 2023, 10:13 a.m.

    Yep - I fully understand that philosophy and am trying to change my mindset to do that. It takes time...

    Also, I LIKE post processing stuff to produce an image that was in my mind but didn't quite make it to the sensor. Classic example is the sky - sometimes, it's not quite how I wanted it, but an AI change of the sky to a different one can make all the difference (to me).

    Alan

  • Members 746 posts
    Oct. 4, 2023, 11:04 a.m.

    Yeah, I get that. I'm not knocking it at all. It's just that for me personally, the less capable the gear is, the more I seem to enjoy using it. I keep losing sight of that, get carried away buying more capable gear again, and realise I'm not enjoying it as much. So I downgrade a bit. Or a lot. And the cycle continues.
    Currently I'm thoroughly enjoying my little plastic IBIS-less G100, manual focus Laowa 10mm or PL 9mm, & Lumix PZ 14-42 kit zoom. Way more than I thought I would, as the G100 & PZ kit zoom are some of the most despised bits of gear on the interwebz. Very liberating, and complete lack of self imposed pressure. My post processing consists of sitting on my Sofa with my tiny little NUC and a monitor perched on the coffee table, & pressing the auto fix button in Elements lol lol. If I'm spending more time fiddling on the computer than actually taking pictures, then I tell myself I'm doing it wrong. You've just got to sort out what you really do enjoy doing, and not get sidetracked ha ha ha.
    Once again, I get that lots of people love doing that. (the processing bit)

  • Members 244 posts
    Oct. 4, 2023, 5:14 p.m.

    While I understand your sentiment, you are already required to use a "supercomputer" and have been for quite some time.

    blog.adobe.com/en/publish/2022/11/08/fast-forward-comparing-1980s-supercomputer-to-modern-smartphone

    Your sentiment, I believe, is one of the reasons behind the popular resurgence of film. Whether that trend actually takes hold is another matter entirely.

  • Foundation 1502 posts
    Oct. 4, 2023, 6:03 p.m.

    I don't see why one would revert to film if one wishes not to get involved with post-processing. Just don't shoot raw files, and accept what the OOC JPEGs look like!

    David