Modern software sucks, and computing is garbage while hardware is blazing new frontiers.

AladainAF

Best Rabbit
<Gold Donor>
12,864
30,813
So I've become enamored with a YouTube channel run by some guy who makes the case that computing and software are absolute dog shit today. He brings up many points about this, concerning things like typing into word processors today and the text sometimes lagging, or taking time to format properly while you type. And yet, 30 years ago, accomplishing a near identical task (writing in a document) had no such issues. Likewise with things like loading programs. Photoshop in 2021 takes over 15 seconds to load on screamingly fast hardware, but 20 years ago, on 20-year-old hardware, it took less time, despite being the same program with just fewer features. While they're not the same feature set, that in no way accounts for the increased time, especially when hardware has improved exponentially since then.

His idea is that there's just code on top of code on top of code, and it's a never ending cycle. The vast majority of people don't want to write their own code, but instead use someone else's libraries and code. Despite the gains in hardware being absolutely insane in the last 20 years, software and computing in general have gotten worse and worse, plagued with more and more problems. Developers make excuse after excuse, but none really want to acknowledge that this is a real problem. Things like webpages that won't load properly, long load times, and updates happening in the middle of process flows are causing real-world problems that wouldn't have happened a few decades ago.

His latest video showcases Windows Terminal, and how it's so unbelievably, painfully slow, only because they simply DirectDraw everything. So when the terminal is displaying a lot of text, your system is spending the vast majority of its resources executing worthless code that does nothing to accomplish the task at hand.

So anyway, the point of the thread: open discussion on this, get thoughts/opinions from other coders, and introduce this channel that I've been following lately.

 
  • 4Like
  • 1Solidarity
Reactions: 4 users

slippery

<Bronze Donator>
7,892
7,705
His idea is that there's just code on top of code on top of code, and it's a never ending cycle.
I mean it's most definitely just this. Everything is decades of spaghetti at this point, because no one wants to spend the resources to rewrite it and do it correctly instead of the mountain of band-aids it is currently.
 

Lanx

<Prior Amod>
60,730
134,003
So I've become enamored with a YouTube channel run by some guy who makes the case that computing and software are absolute dog shit today. He brings up many points about this, concerning things like typing into word processors today and the text sometimes lagging, or taking time to format properly while you type. And yet, 30 years ago, accomplishing a near identical task (writing in a document) had no such issues. Likewise with things like loading programs. Photoshop in 2021 takes over 15 seconds to load on screamingly fast hardware, but 20 years ago, on 20-year-old hardware, it took less time, despite being the same program with just fewer features. While they're not the same feature set, that in no way accounts for the increased time, especially when hardware has improved exponentially since then.

His idea is that there's just code on top of code on top of code, and it's a never ending cycle. The vast majority of people don't want to write their own code, but instead use someone else's libraries and code. Despite the gains in hardware being absolutely insane in the last 20 years, software and computing in general have gotten worse and worse, plagued with more and more problems. Developers make excuse after excuse, but none really want to acknowledge that this is a real problem. Things like webpages that won't load properly, long load times, and updates happening in the middle of process flows are causing real-world problems that wouldn't have happened a few decades ago.

His latest video showcases Windows Terminal, and how it's so unbelievably, painfully slow, only because they simply DirectDraw everything. So when the terminal is displaying a lot of text, your system is spending the vast majority of its resources executing worthless code that does nothing to accomplish the task at hand.

So anyway, the point of the thread: open discussion on this, get thoughts/opinions from other coders, and introduce this channel that I've been following lately.

This faggot sounds like the hippy soyboys who cant cook for shit and pretend to be cuisinists by doing deconstructed food
 

Aldarion

Egg Nazi
8,945
24,468
His idea is that there's just code on top of code on top of code, and it's a never ending cycle. The vast majority of people don't want to write their own code, but instead use someone else's libraries and code.
My experience is with coding for bioinformatics, not software design, but in that world this statement is absolutely true.

It's not even viewed as laziness much of the time. They just say "why reinvent the wheel?" and install a complex third-party library or module instead of taking the time to write the simple piece of code they actually need.

So while I can't speak from experience about whether it creeps into software design, there are a lot of genuine CS types who say it does. So I bet it does.
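A trivial sketch of the pattern (hypothetical file and column names, plain Python): reaching for a heavyweight dependency when the simple piece of code you actually need is a few standard-library lines.

```python
# What often gets written: pull in a big third-party stack for a tiny job.
#   pip install pandas
#   import pandas as pd
#   total = pd.read_csv("samples.csv")["read_count"].sum()

# What was actually needed: a few lines of the standard library.
import csv

total = 0
with open("samples.csv", newline="") as f:    # hypothetical input file
    for row in csv.DictReader(f):
        total += int(row["read_count"])       # hypothetical column name
print(total)
```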
 

Fucker

Log Wizard
11,567
26,182
Adobe is the worst out of them all. It has Creative Cloud, which is fragile, slow, and spams the fuck out of your system with a bunch of random shit. I switched to an older, pre-CC version and also use Affinity Photo in its place.

Adobe's worst crime is it still doesn't do multicore anything. Compressing 1GB of JPGs down to a small size takes well over a minute and a half (1:41). The same operation in Affinity Photo takes... 16 seconds.
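To make the multicore point concrete, here's a rough sketch (assuming Python with Pillow installed; the folder paths and quality setting are made up) of fanning JPEG recompression out across every core instead of grinding through it on one:

```python
# Rough sketch: recompress every JPG in a folder in parallel, one worker per core.
# Assumes Python 3 with Pillow installed; paths and quality are hypothetical.
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path
from PIL import Image

SRC = Path("./photos")        # hypothetical input folder
DST = Path("./photos_small")  # hypothetical output folder
DST.mkdir(exist_ok=True)

def recompress(path: Path) -> str:
    with Image.open(path) as img:
        img.save(DST / path.name, "JPEG", quality=60, optimize=True)
    return path.name

if __name__ == "__main__":
    files = sorted(SRC.glob("*.jpg"))
    # A single-threaded app leaves most of a modern CPU idle for this workload.
    with ProcessPoolExecutor() as pool:
        for name in pool.map(recompress, files):
            print("done:", name)
```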

Windows is trash these days. I know you can turn off Windows animations and effects, but it's just slow. Slow everywhere. Saving a pic to the drive? I can count seconds before it appears on my desktop. Sure, it's being filtered through the canned AV, but it is still too slow. It has context menus from every OS dating back to Windows fucking 3.1... and Windows 11 will add another pile of shit on top of that, too.

I love my M1 MacBook Air and new iPad. Whistling around the UI is crisp and instant. Sure, the M1 can poke around for a bit launching a program, but I'll take it and its fast UI over Windows all the time. The iPad used to be kinda pokey, but it is every bit as fast whistling around the UI and even in general computing. It's faster than my 4790K was in terms of CPU power. I had to hang a giant Noctua onto the 4790K to cool it, and it still ran too hot if I pushed it really hard. The iPad sips power and doesn't get hot.

In terms of CPU power, both Intel and AMD are in the dark ages. The M1 has 4 small and 4 big cores. The 4 big cores can keep up with and even surpass my 8-core 3700X in some tasks, all while using a fraction of the power and not needing a cooling fan at all.

But, yeah....software has been trash for a while and getting worse. Lazy programmers and their bloated fucking libraries.
 

Mist

Eeyore Enthusiast
<Gold Donor>
30,414
22,202
There's a lot of completely unoptimized shit, not just at the consumer or prosumer level, but even in enterprise SaaS and infrastructure products.

Like holy shit, the Avaya trash that I work on for a living. This is software that ran just fine on integrated circuits in a Lucent phone switch nearly 30 years ago, and yet it sometimes still hangs up and crashes on the latest and greatest datacenter hardware.
 
  • 1Like
Reactions: 1 user

Deathwing

<Bronze Donator>
16,403
7,399
Good software development should include automated performance regression testing and profiling. I manage QA for a static analysis tool, and we routinely benchmark it against large codebases like Android. Deviations in runtime outside of 5% usually require investigation, so that the cause can at least be explained and we can then decide whether it's worth fixing.

Automated testing is cheap once you pay the upfront setup cost. Committing engineers to investigating and fixing problems is expensive, so it's easy to see why shit falls through the cracks.
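For anyone curious what that kind of gate looks like in practice, a minimal sketch (the benchmark script, baseline file, and 5% threshold here are placeholders, not our actual tooling):

```python
# Minimal sketch of an automated performance regression gate for CI.
# The benchmark command and baseline file are hypothetical placeholders.
import json
import subprocess
import sys
import time

BENCH_CMD = ["./run_benchmark.sh"]   # hypothetical benchmark script
BASELINE_FILE = "perf_baseline.json" # hypothetical stored baseline
THRESHOLD = 0.05                     # flag deviations beyond 5%

start = time.perf_counter()
subprocess.run(BENCH_CMD, check=True)
elapsed = time.perf_counter() - start

with open(BASELINE_FILE) as f:
    baseline = json.load(f)["seconds"]

deviation = (elapsed - baseline) / baseline
print(f"elapsed={elapsed:.1f}s baseline={baseline:.1f}s deviation={deviation:+.1%}")

if abs(deviation) > THRESHOLD:
    # Fail the pipeline so someone investigates, explains the cause,
    # and either fixes it or signs off and updates the baseline.
    sys.exit(1)
```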
 
  • 3Like
Reactions: 2 users

Khane

Got something right about marriage
19,836
13,354
20 years ago, webpages and software were written to perform a specific function; you paid for the software and got free support. Now they're written to collect as much data from their users as possible while only sort of doing the task they were designed for, and to keep subscription/service models viable by making support necessary.

I'd wager that 20 years ago, the people coding and creating software actually enjoyed it and had curious, creative minds, because it was something they were genuinely interested in doing and it was still a fairly new career path. It has since become such a lucrative career path that there are likely many people who don't care how it works as long as it "works" so they can collect a paycheck, and that's why they got into it.
 
  • 7Like
Reactions: 6 users

Intrinsic

Person of Whiteness
<Gold Donor>
14,264
11,711
I need to watch some of that guy's channel, maybe. This was something I was thinking about last weekend because my computer was churning and I noticed that Brave was taking up 98% of my RAM. Normal amount of tabs open (so, you know, like 280, not counting porn). One page was eating up 8 gigs of RAM. The internet worked just fine on Angelfire and GeoCities. Now with NoScript and whatchamacallit, it is easy to see that every page is running 48,000 scripts and redirects just to help you see some 40 year old pretending to be someone's 18 year old step sister.
 

Voyce

Shit Lord Supreme
<Donor>
7,119
23,269
Reading the title at face value, yes... modern software is over-engineered dogshit.

Gmail in the web browser actively fights with you to stop you from doing the most basic web 1.0 shit, et al.
 

Izo

Tranny Chaser
18,522
21,371
20 years ago, webpages and software were written to perform a specific function; you paid for the software and got free support. Now they're written to collect as much data from their users as possible while only sort of doing the task they were designed for, and to keep subscription/service models viable by making support necessary.

I'd wager that 20 years ago, the people coding and creating software actually enjoyed it and had curious, creative minds, because it was something they were genuinely interested in doing and it was still a fairly new career path. It has since become such a lucrative career path that there are likely many people who don't care how it works as long as it "works" so they can collect a paycheck, and that's why they got into it.
Indeed. WinRAR is still free, even the Hebrew v6.00. And optimized for the fancy 64-bit platform too.

1626023809403.png

IT in a nutshell too.
 

LiquidDeath

Magnus Deadlift the Fucktiger
4,899
11,321
Good software development should include automated performance regression testing and profiling. I manage QA for a static analysis tool, and we routinely benchmark it against large codebases like Android. Deviations in runtime outside of 5% usually require investigation, so that the cause can at least be explained and we can then decide whether it's worth fixing.

Automated testing is cheap once you pay the upfront setup cost. Committing engineers to investigating and fixing problems is expensive, so it's easy to see why shit falls through the cracks.

Not just this, but any kind of oversight whatsoever. I work in Change Management and the developers at my company actively work against any attempt to hold them the least bit accountable for ensuring any quality control or planning. The dev leads even go as far as to suggest that we should be even more lenient in our approvals to ensure they can deploy as quickly as possible. It is the most ridiculous shit I can think of, but it has been the same for my entire 15 years at the company.
 

Deathwing

<Bronze Donator>
16,403
7,399
Not just this, but any kind of oversight whatsoever. I work in Change Management and the developers at my company actively work against any attempt to hold them the least bit accountable for ensuring any quality control or planning. The dev leads even go as far as to suggest that we should be even more lenient in our approvals to ensure they can deploy as quickly as possible. It is the most ridiculous shit I can think of, but it has been the same for my entire 15 years at the company.
What exactly is change management? To me it sounds like someone dedicated to reviewing and approving commits.
 

jooka

marco esquandolas
<Bronze Donator>
14,413
6,131
You also get lots of "turn a single click into three in order to get a promotion" type shit that does exactly nothing but fuck with workflows, just to show you 'deployed' something.
 

LiquidDeath

Magnus Deadlift the Fucktiger
4,899
11,321
What exactly is change management? To me it sounds like someone dedicated to reviewing and approving commits.
Nah, that kind of thing would be done by another developer or a dev lead if it was required.

Change Management is at a higher level than individual commits. It would need to be at least a collection of commits that would make up a new version. It is, admittedly, a purely bureaucratic function but one that exists because someone must ensure that the changes across the organization don't collide in catastrophic ways that affect the bottom line.

Of course, we're making a full organizational move to Azure, so it will be continuous deployment, but even in that case they balk at any suggestion of approval gates, code reviews, or even automated deployment and testing in a like-for-like, non-production environment.

Basically, anything remotely resembling oversight is immediately attacked as an impediment to the release cycle.
 
  • 1Like
Reactions: 1 user

AladainAF

Best Rabbit
<Gold Donor>
12,864
30,813
I need to watch some of that guy's channel, maybe. This was something I was thinking about last weekend because my computer was churning and I noticed that Brave was taking up 98% of my RAM. Normal amount of tabs open (so, you know, like 280, not counting porn). One page was eating up 8 gigs of RAM. The internet worked just fine on Angelfire and GeoCities. Now with NoScript and whatchamacallit, it is easy to see that every page is running 48,000 scripts and redirects just to help you see some 40 year old pretending to be someone's 18 year old step sister.

Here's the video that got me first interested in his channel.

The really good points he brings up are in the first 10 minutes of the video, and at 29:00 or so he starts discussing what he feels is the solution.

 
  • 2Like
Reactions: 1 users

swayze22

Elite
<Bronze Donator>
1,211
1,091
There's a lot of completely unoptimized shit, not just at the consumer or prosumer level, but even in enterprise SaaS and infrastructure products.

Like holy shit, the Avaya trash that I work on for a living. This is software that ran just fine on integrated circuits in a Lucent phone switch nearly 30 years ago, and yet it sometimes still hangs up and crashes on the latest and greatest datacenter hardware.
Avaya fucking blows, that's all I have to contribute here.
 

Deathwing

<Bronze Donator>
16,403
7,399
Here's the video that got me first interested in his channel.

The really good points he brings up are in the first 10 minutes of the video, and at 29:00 or so he starts discussing what he feels is the solution.

I'm not saying he's wrong, but he's painting a one-sided argument with rose-colored glasses on. From what I gather, he's a college professor? If he wants his arguments to be more definitive, I think he should rely less on feels about the past.




Also, he's turned off comments. I don't know the reason why; I can guess that perhaps there's some academic use for his videos, and who wants to teach a course when "ur a fgt" is in the comments. But at the same time, if he's so gung-ho for the past, he should leave comments on.
 

Lendarios

Trump's Staff
<Gold Donor>
19,360
-17,424
So I've become enamored with a YouTube channel run by some guy who makes the case that computing and software are absolute dog shit today. He brings up many points about this, concerning things like typing into word processors today and the text sometimes lagging, or taking time to format properly while you type. And yet, 30 years ago, accomplishing a near identical task (writing in a document) had no such issues. Likewise with things like loading programs. Photoshop in 2021 takes over 15 seconds to load on screamingly fast hardware, but 20 years ago, on 20-year-old hardware, it took less time, despite being the same program with just fewer features. While they're not the same feature set, that in no way accounts for the increased time, especially when hardware has improved exponentially since then.

His idea is that there's just code on top of code on top of code, and it's a never ending cycle. The vast majority of people don't want to write their own code, but instead use someone else's libraries and code. Despite the gains in hardware being absolutely insane in the last 20 years, software and computing in general have gotten worse and worse, plagued with more and more problems. Developers make excuse after excuse, but none really want to acknowledge that this is a real problem. Things like webpages that won't load properly, long load times, and updates happening in the middle of process flows are causing real-world problems that wouldn't have happened a few decades ago.

His latest video showcases Windows Terminal, and how it's so unbelievably, painfully slow, only because they simply DirectDraw everything. So when the terminal is displaying a lot of text, your system is spending the vast majority of its resources executing worthless code that does nothing to accomplish the task at hand.

So anyway, the point of the thread: open discussion on this, get thoughts/opinions from other coders, and introduce this channel that I've been following lately.

Aladain: I'm glad you enjoy programming videos; the following is not aimed at you.


Jesus... That guy is so fucking wrong it irritates me.


Complaining about how fast the Windows terminal renders output is next-level autism for sure. Same as people who love to brag that their code is one picosecond faster than the competition.

It is retarded, to be honest. Windows Terminal is NOT slow, and in no way, shape, or form is it detrimental to the development experience. "Oh look at it, I'm piping megabytes worth of text through it that I'm not even reading." Wait, what? Why are you outputting data to a terminal, which is intended for a human to read, at a speed that makes it impossible for a human to read, and THEN complaining that it is too slow? You're going to tell me with a straight face that you can read all that text at that speed?

This is the same as complaining that your kitchen faucet isn't as powerful as a fire hose connected to a fire hydrant. I really can't believe the author is serious about the speed at which Windows Terminal displays a text dump.
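For what it's worth, anyone can measure the thing being argued about themselves. A rough sketch (plain Python; the line count is arbitrary): run it once in a terminal window and once redirected to a file, and compare how much of the time goes to rendering text nobody is reading.

```python
# Rough sketch: time dumping a pile of text to stdout.
# Run in a terminal, then redirected (python dump.py > out.txt),
# to see how much of the wall-clock time is terminal rendering.
import sys
import time

LINES = 200_000  # arbitrary amount of text nobody could actually read

start = time.perf_counter()
for i in range(LINES):
    sys.stdout.write(f"line {i}: the quick brown fox jumps over the lazy dog\n")
sys.stdout.flush()
elapsed = time.perf_counter() - start

# Print the summary to stderr so it still shows when stdout is redirected.
print(f"wrote {LINES} lines in {elapsed:.2f}s", file=sys.stderr)
```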


And then to criticize developers for relying on libraries instead of writing all the code themselves is just stupid. You always have to rely on libraries written by others, because of two things: time and knowledge. No one has the time on their hands to write all the code that would be needed to bypass the existing libraries that give you the functionality you want. Second, no one has the knowledge, as in brain power, to know how to code for all of those things.

The universe of programming is vast, and a human can only specialize in a few things; no one is an expert in all of it. There isn't a single company in the world that can afford to write all the code it needs without relying on libraries written by others outside the company.


Regarding word processors today being slower than WordPerfect: so what? Can WordPerfect of 1995 do any of the million things a current word processor can? The answer is no. If the only downside is a minor graphical delay when pressing a key, then it is a tradeoff that works in favor of the modern software.


As a developer with 20 years of experience in the field, this is the best advice I can give anyone.

Your salary is paid by people, people who in turn need to make money themselves and are using your services to build, maintain, and keep the money coming in.

Your code has two major limitations: the first is the amount of time it takes to make something, and the second is the quality of what you can make in that time. The more time you spend on something, the better the quality of the code. You have to balance the time you need to do something against the time you have to do it. And yes, you have to cut corners, you have to let bugs be, and you have to do the most that you can with the time you have. You do this, and you tell your managers and product owners what issues you have and what corners you cut. You never hide anything. Even things such as: "Look, it works for 90% of our test cases, but for 10% it doesn't. What do we do? Do we add more time, or do we deploy as is?"



I would like to recommend https://www.youtube.com/user/ElfocrashDev and https://www.youtube.com/user/IAmTimCorey as development resources for anyone interested in developing. I use them a lot. Nick is advanced, Tim is beginner level, but both are technically sound.
 

Lendarios

Trump's Staff
<Gold Donor>
19,360
-17,424
Also, he's turned off comments. I don't know the reason why; I can guess that perhaps there's some academic use for his videos, and who wants to teach a course when "ur a fgt" is in the comments. But at the same time, if he's so gung-ho for the past, he should leave comments on.

Judging by his comment that "initializing a video card is fundamentally not hard," he is a quack, or whatever the programming equivalent is.

When someone uses words such as "fundamentally," just disregard anything they say.

Here is why what he is saying is just plain wrong.

Off the top of my head, there are around a couple hundred different models of video cards in use right now in the world. This number is not an exaggeration and is probably lowballing the number of different models of graphics cards out there. There are 15-20 video card and integrated graphics manufacturers that release multiple versions per year, and with cards staying on the market for years, maybe even decades, that increases the number by a lot.


Imagine now that in order for your program to work, you have to write custom code for each and every one of them. He wants developers to do this instead of using the libraries that abstract it for us. So imagine for a second the insane amount of work that entails. Imagine that if you bought a video card released AFTER your program was made, your program crashes because it isn't coded to initialize that card, and his solution is "release a new version of your program to account for this."

That is the level of absurdity he is proposing.
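To put in perspective what those abstraction layers buy you, here's a rough sketch (assuming Python with the glfw and PyOpenGL packages installed; this is not code from the video) where the driver and the libraries deal with whichever card is in the machine, and the application never mentions a specific GPU model:

```python
# Rough sketch: get an accelerated graphics context without writing per-GPU code.
# Assumes the glfw and PyOpenGL packages are installed; the driver plus these
# libraries handle whatever card is actually in the machine.
import glfw
from OpenGL.GL import glGetString, GL_RENDERER, GL_VENDOR, GL_VERSION

if not glfw.init():
    raise RuntimeError("could not init GLFW")

glfw.window_hint(glfw.VISIBLE, glfw.FALSE)  # hidden window, we only want the context
window = glfw.create_window(640, 480, "probe", None, None)
if not window:
    glfw.terminate()
    raise RuntimeError("could not create an OpenGL context")
glfw.make_context_current(window)

# The same three lines work on any vendor's hardware with a working driver.
print("vendor:  ", glGetString(GL_VENDOR).decode())
print("renderer:", glGetString(GL_RENDERER).decode())
print("version: ", glGetString(GL_VERSION).decode())

glfw.terminate()
```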