I seriously don't get the appeal of R for "big data"... its single-threaded limitation is just a gigantic bottleneck when your data is actually "big".
I remember I had to write some functions in R to do an analysis over 1.7TB of data, and it was just insanely slow. I redid it later using parallelism in Clojure and it was 3 orders of magnitude faster. Like, I wrote the R, then after seeing the estimated completion time, I re-implemented some of the libraries I'd used in Clojure, rewrote the functions, and ran the whole thing to completion before the R code had finished.
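For what it's worth, the shape of the Clojure version was roughly this (a simplified sketch, not my actual code; process-chunk is just a placeholder for the real per-chunk analysis):

    ;; Split the data into chunks and let pmap spread them across cores,
    ;; instead of grinding through everything on one thread like the R did.
    (defn process-chunk [chunk]
      (reduce + chunk))                     ; placeholder for the real analysis

    (defn parallel-analysis [xs chunk-size]
      (->> (partition-all chunk-size xs)    ; break the data into chunks
           (pmap process-chunk)             ; process chunks on multiple threads
           (reduce +)))                     ; combine the partial results

    ;; (parallel-analysis (range 1e7) 100000)

Nothing clever going on there, it's just that every core actually gets used.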
I understand that the syntax and tools are a big part of R's appeal, but a lot of languages have tools of similar quality without that big drawback. If you're dealing with data on the order of gigabytes it's not really going to matter, but for actual "big data" (and 1.7TB isn't even really big...) it becomes a major problem.