So what's keeping the average consumer's use cases stuck at the same shit they've been for 20 years? Like, would they suddenly not be able to get by if PowerPoint went 3D, or AI got thrown into the equation, or some other idea I can't think of made that shit more resource-intensive?
I guess it's like displays: for years 1080p/60Hz was fine, and then all of a sudden higher-end display tech exploded and we all want faster refresh rates and higher and higher resolutions. What was the catalyst for that? Why was 1080p/60 suddenly no good? Will the same thing happen to the mundane tasks the average consumer runs on those shit-box $300 laptops? Will we get to the point where a 3080 is required to run PowerPoint because some bitchin' presentations come out?