IT/Software career thread: Invert binary trees for dollars.

ShakyJake

<Donor>
7,963
20,076
Anyone familiar with dependency injection?
Yes, although it's really not that complicated. Basically the concept is that you are not tying your objects to a concrete implementation of any of their dependencies.

A simple example:
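(Something like this - a minimal Java sketch; AccountsController and AccountsRepository are the names from the post, everything else is assumed:)

    import java.util.List;

    class AccountsRepository {
        // Imagine this hits a real database.
        List<String> getAll() {
            return List.of("alice", "bob");
        }
    }

    class AccountsController {
        private final AccountsRepository repository;

        AccountsController() {
            // Tightly coupled: the controller constructs its own dependency,
            // so it is wired to this exact implementation (and its database).
            this.repository = new AccountsRepository();
        }

        List<String> getAccounts() {
            return repository.getAll();
        }
    }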

In this example we are tying our AccountsController to a concrete version of the AccountsRepository (which we instantiate in the Controller's constructor). The repository class may be tied directly to database access. Situations like this make testing difficult because, in order to test, we have to have a functional database with data, etc. Additionally, if in the future we decide to change the AccountsRepository to a new implementation - AccountsRepositoryII or something - we would need to go back and change our Controller class to use the new version.

However, with DI we do this:
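(Again a minimal Java sketch; IRepository is the interface named in the post, the rest is assumed:)

    import java.util.List;

    interface IRepository {
        List<String> getAll();
    }

    class AccountsRepository implements IRepository {
        public List<String> getAll() {
            return List.of("alice", "bob"); // real database access in practice
        }
    }

    class AccountsController {
        private final IRepository repository;

        // The dependency is injected: the controller neither knows nor cares
        // which concrete IRepository it receives (real one, mock, v2, ...).
        AccountsController(IRepository repository) {
            this.repository = repository;
        }

        List<String> getAccounts() {
            return repository.getAll();
        }
    }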

Here, we are injecting the repository object into the controller's constructor. The class now has no knowledge of the instantiation details behind the IRepository interface. Testing becomes easy because the IRepository object can be a mock object with whatever data we need, and we can change the concrete implementation of IRepository without any need to change the controller class.

Make sense?
 

Tenks

Bronze Knight of the Realm
14,163
607
Anyone familiar with dependency injection? I keep seeing references to it, links to Dagger and Dagger 2, and explanations that go deep into coffee-maker metaphors but are light on "this is your code without DI, and this is your code with DI" examples.
Like Spring? They make dependency injection (oftentimes also called inversion of control) more complicated than it actually needs to be. In general DI is really good for handing you the objects you need to use. Often it takes over what would be a Factory pattern. Let's say we have an interface and three implementations - currency classes. If we start up in the USA we want USD, if we start in the UK we want pounds, and if we start in Canada we want CAD. With DI you can use the framework to just inject the implementation you need, without your code needing to know what the heck is going on or how to create the concrete class it requires. It just knows it needs to call Currency.doFunction(). It does clean up the code and is pretty widely used.
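(A rough sketch of that idea in plain Java - Currency.doFunction() is from the post; the implementations and the hand-rolled wiring are assumptions, since a framework like Spring would normally do the injecting:)

    interface Currency {
        String doFunction();
    }

    class USD implements Currency {
        public String doFunction() { return "$"; }
    }

    class GBP implements Currency {
        public String doFunction() { return "£"; }
    }

    class CAD implements Currency {
        public String doFunction() { return "C$"; }
    }

    class Checkout {
        private final Currency currency;

        // Checkout never constructs a Currency itself; whoever wires the app
        // (a DI container, or plain code at startup) picks the implementation.
        Checkout(Currency currency) {
            this.currency = currency;
        }

        String priceTag(int amount) {
            return currency.doFunction() + amount;
        }
    }

    class Main {
        public static void main(String[] args) {
            // Started up in the UK? Inject GBP; the Checkout code never changes.
            Checkout checkout = new Checkout(new GBP());
            System.out.println(checkout.priceTag(10)); // £10
        }
    }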
 

moontayle

Golden Squire
4,302
165
I've done it in some instances without knowing that's what it was (go go junior dev experience), but it's one of those things being touted by experienced Android devs as something they use via the Dagger or Dagger 2 libraries. That and view injection to cut down on the heavy boilerplate around view elements (sooo much boilerplate). Once I get the first iteration of this project done I plan to look into how I can use these things, both to get experience with them and to make the overall code better.
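(For reference, this is the kind of Android boilerplate view injection gets rid of - a hypothetical Activity; the layout and view IDs are made up:)

    import android.app.Activity;
    import android.os.Bundle;
    import android.widget.Button;
    import android.widget.TextView;

    public class ProfileActivity extends Activity {
        private TextView name;
        private TextView email;
        private Button save;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.profile);
            // One lookup-and-cast per view, in every Activity and Fragment.
            // View-injection libraries collapse this into one annotated field
            // per view plus a single bind() call.
            name = (TextView) findViewById(R.id.name);
            email = (TextView) findViewById(R.id.email);
            save = (Button) findViewById(R.id.save);
        }
    }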

The other thing I keep hearing about is RxJava (reactive extensions), but apparently that has a Dwarf Fortress-level learning curve, and I'm barely comfortable with what I do now, so that's for the future.
 

Vinen

God is dead
2,791
497
They've yet to C why they need the D. Everyone needs the D.

But really, C is probably not going anywhere in our lifetimes, because the truth is that for x86 architecture C is basically it in terms of performance. C was so well designed out of the gate that there's no reason to use anything else if you're looking for raw performance. You can add more features, but in terms of raw speed you can't make anything significantly better than C (it's possible, but we're talking low single-digit differences here). So the community as a whole sees no real reason to re-invent the wheel; when they do, we get things like D, Rust, etc., which claim to be very fast with essentially just more features or different paradigms, but they're still essentially C.

I expect this to change once x86 starts dying in the next ~15 years, because C is not optimal for ARM assembly, and I predict that either ARM will overtake x86 in the future or quantum computing will make this all pointless. Whichever gets there first!
Has there been an ARM processor worth a shit yet in terms of raw performance? I haven't been following it, but last I remember ARM was primarily for embedded devices.
Once I see companies pivoting toward ARM-based virtualization, I'll agree.

The world is not going back to bare metal deployments in Data Centers.
 

Noodleface

A Mod Real Quick
38,360
16,251
ARM is a long way from anything other than embedded - and you can bet Intel is working on stomping out that fire. There are so many goddamn vendors for ARM processors it's nuts, and they all have varying technologies, PCIe lanes, features, ports, etc. The vendors also seem to halt production or close up shop seemingly overnight. The challenge will be finding a unified approach to ARM.

The first non-embedded places you'll see ARM used are things where an SoC is ideal - maybe dense disk arrays where you need a small processor to handle them. Even then, meh.
 

Vinen

God is dead
2,791
497
Surfaces (old ones, at least) used ARM processors too. But those are weak
The RT model, which failed, used them. All Microsoft Surfaces now use x86 to my knowledge.

Chromebooks aren't exactly successful either. I've personally not met one person with a Chromebook.
 

Tripamang

Naxxramas 1.0 Raider
5,538
34,318
I seriously don't get the appeal of R for "big data"... its single-thread limitation is just a gigantic bottleneck when your data is actually "big".

I remember I had to write some functions in R to do an analysis over 1.7TB of data, and it was just insanely slow. I re-did it later using parallelism in Clojure and it was three orders of magnitude faster. Like, I wrote the R, then, seeing the estimated completion time, I re-implemented some of the libraries I'd used in Clojure, re-wrote the functions, executed it, and got the result - all before the R code finished.

I understand that the syntax and tools are a big part of R, but a lot of languages have tools of similar quality without the big drawback. If you're dealing with data on the order of GB it's not really going to matter, but for actual "big data" (and 1.7TB isn't even really big...) it becomes a major problem.

Edit: I know that roughly last year(?) things like the parallel package and other R libraries came out with parallel versions of functions, but R at the language level is still inherently single-threaded without dropping into C/C++. I hate writing R code even though I'm forced to frequently.
I don't know shit about R, but is it possible to split the dataset up into smaller batches, run many parallel jobs against them, and then recombine the results at the end?
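(Something like this is what I mean, sketched in Java since I don't know the R side - toy batches standing in for real file chunks:)

    import java.util.List;
    import java.util.stream.Collectors;
    import java.util.stream.IntStream;

    public class ChunkedAnalysis {
        public static void main(String[] args) {
            // Stand-in for a big dataset already split into batches.
            List<int[]> batches = IntStream.range(0, 8)
                    .mapToObj(i -> IntStream.rangeClosed(1, 1_000_000).toArray())
                    .collect(Collectors.toList());

            // Run the per-batch analysis in parallel, then recombine the results.
            long total = batches.parallelStream()
                    .mapToLong(batch -> IntStream.of(batch).asLongStream().sum())
                    .sum();

            System.out.println(total); // 8 batches * 500000500000 each
        }
    }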
 

Kovaks

Mr. Poopybutthole
2,358
3,147
Yeah, I see a lot of them there, and I'm seeing more and more in college as well. Not pretending they're popular, but clearly they're hitting a market segment well.
My guess is that the more software as a streaming service gets refined, the more this segment will grow. When you can reliably get things like Nvidia GRID or Adobe's planned streaming Photoshop, then all of a sudden all you need is a cheap computer and a good internet connection.
 

Asshat wormie

2023 Asshat Award Winner
<Gold Donor>
16,820
30,968
I don't know shit about R, but is it possible to split the dataset up into smaller batches, run many parallel jobs against them, and then recombine the results at the end?
The plyr package is probably the way to go for something like that.
 

Vinen

God is dead
2,791
497
My guess is that the more software as a streaming service gets refined, the more this segment will grow. When you can reliably get things like Nvidia GRID or Adobe's planned streaming Photoshop, then all of a sudden all you need is a cheap computer and a good internet connection.
A good internet connection is not happening anytime soon in the US. If we were Korea I could see the US driving this, but we are too spread out for it to ever go mainstream.
 

Noodleface

A Mod Real Quick
38,360
16,251
I don't see thin clients being the norm for a long time, and I'll always be a hardware enthusiast anyway. Hell, I'm excited to get the latest processors at work even though they barely change what we have to do. I just like the shinies.
 

Noodleface

A Mod Real Quick
38,360
16,251
Just need to vent for a minute

We re-orged some people, and since my team was very slim we took on a principal hardware engineer (pretty high up) as a new software engineer in our group - he's an older guy with kids in their 20s. Our code is extremely low-level, as you might imagine, so a lot of the coding concepts you guys talk about don't really apply here - it's firmware. We write in C and follow the UEFI spec, nothing groundbreaking with the exception of a few very technical pieces of code. This guy has been around a while, so C isn't an issue for him, and since he's been working on hardware forever he knows his way around.

Another guy and I are sort of mentoring him so that he follows good coding and debugging practices. The other guy is on vacation, so it's really just me. But this hardware dude is at my desk every 10 minutes asking me questions.

It started off innocently enough with small questions, but now he's asking about our code all the time - "why do we do this here?" and "this code doesn't make sense, can you explain it?"

Just amazes me that someone could get so far in their career and not know how to "dig right into it." I'm not a genius or anything, but I try to reserve my questions for pieces of code/technology that I really have absolutely no clue about. I take pride in being fairly self-sufficient in terms of digging into our codebase, but this guy is just asking a million questions.

I even try to lead him with breadcrumbs to what he needs to see. I'll say things like "hmm I think I saw some code in somefile.c, maybe look at that" and 5 minutes later he'll come back and say "nope wasn't there" or ask another leading question about it.

Just blows my mind.

He also got a little irritated that I had Funko Pop toys on top of my desk (GoT and Breaking Bad); he said he didn't understand why a grown man would have toys. I got a little offended.
 

moontayle

Golden Squire
4,302
165
That last part is fightin' words. Find a nearby bluehair and point xim in his direction.

I take pride in being fairly self-sufficient in terms of digging into our codebase, but this guy is just asking a million questions.
With you there. I kind of have to be, since I have no one to go to, but even in the general sense I'll use my Google-fu, SO, and the like to dig into something and experiment with code branched off master before I even get to the point where I feel lost. It may take me a bit to get where I'm going, but I get there and I learn a lot in the process.
 

Cad

scientia potentia est
<Bronze Donator>
25,824
50,693
Just need to vent for a minute

We re-orged some people, and since my team was very slim we took on a principal hardware engineer (pretty high up) as a new software engineer in our group ... this guy is just asking a million questions.
How's it feel to have a useless annoying guy dumped on your team?