Because I am always curious, I would like to know what the bolded part means, being a non-photog person, and how it relates to GPU processing.
At a basic level, raw parallel processing power. A CPU has a handful of fast cores; a GPU has thousands of simpler ones that all work at once. That architecture is particularly well-suited for AI and high-performance computing workloads, which require performing many calculations concurrently.
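To make that concrete, here's a minimal sketch in Python using PyTorch (assuming you have torch installed and a CUDA-capable card; the data is random numbers standing in for a real photo):

```python
import torch

# One channel of a 24-megapixel photo: ~24 million numbers.
pixels = torch.rand(6000 * 4000)

# Move the data to the GPU if one is available.
if torch.cuda.is_available():
    pixels = pixels.to("cuda")

# A CPU works through a job like this a few values at a time;
# a GPU's thousands of cores each take a slice and run simultaneously.
brightened = pixels * 1.1
```

Same one-line operation either way; the GPU just applies it to millions of values at once instead of grinding through them sequentially.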
AI and high-performance computing tasks boil down to huge volumes of mathematical operations, mostly matrix and vector math. GPUs are optimized for exactly that kind of work and can perform it much faster than CPUs, which dramatically cuts the time a computation takes. A better GPU means an AI task applied across a large data set finishes much sooner.
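You can see the gap yourself with a quick (and rough) timing test; this is a sketch, and the actual speedup depends entirely on which CPU and GPU you have:

```python
import time
import torch

a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

t0 = time.perf_counter()
a @ b                                    # matrix multiply on the CPU
cpu_s = time.perf_counter() - t0

if torch.cuda.is_available():
    a_gpu, b_gpu = a.to("cuda"), b.to("cuda")
    a_gpu @ b_gpu                        # warm-up: the first GPU call pays one-time startup costs
    torch.cuda.synchronize()
    t0 = time.perf_counter()
    a_gpu @ b_gpu                        # the same multiply on the GPU
    torch.cuda.synchronize()             # GPU calls are asynchronous; wait before stopping the clock
    gpu_s = time.perf_counter() - t0
    print(f"CPU: {cpu_s:.3f}s  GPU: {gpu_s:.4f}s")
```

On this kind of workload the GPU is often an order of magnitude or two faster, though again, hardware varies.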
Deep learning, a subset of AI, involves training large neural networks with millions or billions of parameters. Training such models is computationally intensive and can take weeks or months on a CPU alone. GPUs accelerate that training enough that complex models can be trained in a reasonable timeframe.
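The shape of that training work looks roughly like this toy loop (the tiny network and random data are stand-ins; real models and data sets are vastly larger):

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# A toy stand-in for a real network; real models have millions or billions of weights.
model = nn.Sequential(nn.Linear(256, 512), nn.ReLU(), nn.Linear(512, 10)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Fake data standing in for a real training set.
inputs = torch.randn(64, 256, device=device)
targets = torch.randint(0, 10, (64,), device=device)

for step in range(100):                  # real training runs for millions of steps
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()                      # the gradient math here is where the GPU earns its keep
    optimizer.step()
```

Every pass through that loop is a pile of matrix multiplies, repeated millions of times, which is why the GPU matters so much.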
Beyond training, GPUs are also crucial for inference, where a trained AI model makes predictions on new data. Real-time or near-real-time inference matters for a wide range of applications: autonomous vehicles, natural language processing, image recognition. GPUs handle inference efficiently, which is what makes low-latency responses possible.
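Inference is the cheaper half of the job; a hedged sketch (again with a stand-in model) looks like this:

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Stand-in for a model you've already trained.
model = nn.Linear(256, 10).to(device).eval()

sample = torch.randn(1, 256, device=device)   # one new piece of data arriving

with torch.no_grad():                          # inference only: skip the gradient bookkeeping
    prediction = model(sample).argmax(dim=1)   # the model's best guess
```

Skipping the gradient bookkeeping is part of why inference can run fast enough for real-time use.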
All of these capabilities are leveraged in recently introduced photo editing software, which takes RAW files and infers what individual pixels should be when the image is slightly out of focus or the detail originally captured is imperfect. AI algorithms can also help tremendously with noise reduction, if you want to go that route (although many don't like messing with noise reduction too much and prefer a little bit of noise in their photos; YMMV).
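Under the hood, that photo software is doing batch inference over your images. Here's a made-up sketch of the shape of it; `denoise_model` is hypothetical, since the real tools ship their own proprietary trained networks:

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# "denoise_model" is a made-up stand-in; a single conv layer keeps the sketch short.
denoise_model = nn.Conv2d(3, 3, kernel_size=3, padding=1).to(device).eval()

# A batch of 16 demosaiced RAW frames as RGB float tensors (random data here).
batch = torch.rand(16, 3, 512, 512, device=device)

with torch.no_grad():
    cleaned = denoise_model(batch)   # the whole batch goes through in one GPU pass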
The effects can be stark, especially with old RAW files from, say, 2005-2010. I have a large collection of images that I want to run through AI photo software, and small batches can take many hours to process without a burly GPU. That's oversimplifying a lot, but I think that's the gist of it.
These myriad uses for GPUs are why it's been so hard to find a card for plain gaming at MSRP since before the pandemic. First crypto mining and now AI have created massive non-gamer demand, especially for top-end Nvidia cards.