IT/Software career thread: Invert binary trees for dollars.

TJT

Mr. Poopybutthole
<Gold Donor>
43,040
109,927
Work is paying for GitHub Copilot so I've spent a lot of time using it the past few weeks. I really like the Copilot editor completions, because when I'm writing a bunch of code that ends up being predictable it will very, very accurately predict 15 lines at a time, in exactly the style I want. I like this a lot.
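For a concrete picture, here's a minimal sketch in Python of the kind of repetitive, predictable block I mean; the function and field names are made up, not from any real project:

```python
# Hypothetical record-mapping boilerplate: after the first couple of fields,
# the rest follow an obvious pattern, which is exactly the kind of block an
# inline completion tool tends to fill in correctly in one shot.
def user_to_dto(user: dict) -> dict:
    return {
        "id": user["id"],
        "first_name": user["first_name"],
        "last_name": user["last_name"],
        "email": user["email"],
        "created_at": user["created_at"],
        "is_active": user["is_active"],
    }

print(user_to_dto({
    "id": 1,
    "first_name": "Ada",
    "last_name": "Lovelace",
    "email": "ada@example.com",
    "created_at": "2024-01-01",
    "is_active": True,
}))
```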

GitHub Copilot Chat, on the other hand, has the same problem all AI does: you need to know exactly what you want to do and have a good idea of what the syntax would generally be in the first place. Because gay ass Copilot will 100% tell you to use method calls that don't fucking exist, and if you argue with it and tell it that you cannot call this method as if it were static, it will just stop and then tell you again to make a static method call that cannot be done. This is annoying, so what do I do? Go right back to reading the documentation like I've done since the day I started working.
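To illustrate the static-call problem, here's a rough Python sketch with invented names (the real library and method were different); the point is the shape of the mistake, not the specific API:

```python
# Invented names for illustration: connect() exists only as an instance
# method, but the chat assistant keeps insisting on calling it statically.
class ApiClient:
    def __init__(self, host: str):
        self._host = host

    def connect(self) -> str:
        return f"connected to {self._host}"

# What the assistant keeps suggesting (there is no static connect()):
# ApiClient.connect("example.com")  # AttributeError: 'str' object has no attribute '_host'

# What the documentation actually says to do:
client = ApiClient("example.com")
print(client.connect())
```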

For extremely routine code it is fine though, and it can give you generally correct syntax examples for just about anything.
 
  • 1Like
Reactions: 1 user

TJT

Mr. Poopybutthole
<Gold Donor>
43,040
109,927
Not sure what you mean. I am not training them; the company is just paying for the service. Microsoft, Google, and the like have unbelievable amounts of data and code examples to train with. That doesn't seem to be the bottleneck.

I think AI for coding purposes is not good for new developers, because it will waste your time showing you things that are incorrect while you lack the background to know when it's wrong. So you just waste your time and bash your head against the wall without even knowing why you're bashing your head against the wall. It's a variant on some retard just copy-pasting a bunch of code from a forum or Stack Overflow without understanding it at all and wondering why it doesn't work.
 
  • 2Like
  • 1Solidarity
Reactions: 2 users

Deathwing

<Bronze Donator>
16,903
7,910
My question was vague, sorry. With no proof, I suspect the true cost of AI is not currently being reflected in its price. A significant portion of the industry is propped up by venture capital that is betting on this becoming profitable soon. On top of that, there are the environmental costs of the compute and electricity that go into LLM training.

All that for retarded stackoverflow copy-pasta seems underwhelming.
 

TJT

Mr. Poopybutthole
<Gold Donor>
43,040
109,927
In theory, if GitHub Copilot could parse official documentation and millions of examples and pinpoint the exact way to construct the logic you are looking for, that would be very valuable. But it has to understand every single retarded way that people can phrase it.

The industry right now is just trying to get everyone on the hook for AI, so they give it away. Once people become dependent on it they will start charging a lot for it. But in this context AI has not become any better than it was two years ago. It has exactly the same problems it did when they started.

For general answers as a robot assistant, AI is great shit, especially for low-level tech support and product support. It's fantastic at that, but it also doesn't need to be much more advanced than it is now to keep doing it.
 
  • 2Like
Reactions: 1 user

Neranja

<Bronze Donator>
2,650
4,241
Work is paying for GitHub Copilot so I've spent a lot of time using it the past few weeks. I really like the Copilot editor completions, because when I'm writing a bunch of code that ends up being predictable it will very, very accurately predict 15 lines at a time, in exactly the style I want. I like this a lot.
I had this exact same experience, in my case with Python. It's good with boilerplate code and "the standard way of doing things", like some sort of code completion on steroids. But once you leave the well-trodden paths you can almost always expect to be in hot water. Like any other AI today, it hallucinates things that don't even exist.
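A rough Python sketch of the kind of "standard way of doing things" I mean; the sample data and names are invented for illustration:

```python
# Plain stdlib boilerplate with a very common shape: the kind of code a
# completion tool autocompletes almost perfectly because it has seen it
# a million times. Data and field names are made up for the example.
import csv
import io

SAMPLE = "id,name,is_active\n1,alice,true\n2,bob,false\n"

def load_active_users(source) -> list[dict]:
    rows = csv.DictReader(source)
    return [row for row in rows if row["is_active"] == "true"]

print(load_active_users(io.StringIO(SAMPLE)))
# [{'id': '1', 'name': 'alice', 'is_active': 'true'}]
```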
 
  • 2Like
Reactions: 1 user