My school's primary languages were C and C++, and a smattering of Java because it was just getting popular. Certainly the low-level understanding that comes from writing a lot of code in those languages is helpful.
But the imperative/procedural mindset that it drills into you leads to some really terrible application code, and it takes a lot of exposure to higher-level languages to break out of that mindset. It took me years. Switching to Ruby was like starting from scratch.
By all means hire a C++ programmer to write your web app. They'll be able to debug your performance issues ricky tick. But also be prepared for some heinous procedural js/ruby/php/clojure/elixir/whatever.
By all means hire a C++ programmer to write your web app.
No one is advocating that anyone write web apps with C++. There are other kinds of servers. The complaint is that the once-generalist Comp Sci degree has been dumbed down, and grads are missing background knowledge they once had.
But also be prepared for some heinous procedural js/ruby/php/clojure/elixir/whatever.
I think you are making a few assumptions that don't hold anymore based on your own CS education (I'm obviously making an assumption there...). I got my CS degree in 98 and there was a strong emphasis on C, C++, algorithms, and systems programming at the time because that's where the jobs were. We only had a cursory overview of other programming languages and paradigms, and no assembly - there were no jobs there. Scripting languages were for unwashed systems administrators, and no real programmer would touch them. Functional programming was a weird little academic thing with no future. OO was "if it is a noun make it a class".
There was a fairly good chance you would end up needing to write your own data structures, algorithms, and sockets code, and come up with a network protocol. You would run compilers, linkers, etc. Basically systems programming lined up with the job market.
Naturally after that I thought that was the "proper" way to teach CS. It worked for me. I got jobs doing things I learned.
20 years later, I literally haven't run a compiler in years. I use libraries for data structures. I don't need to worry about allocating memory, billion dollar industries run on scripting languages. People are passing functions to functions that return functions like that's how it's always been.
I guess my point is that "generalist" education needs to evolve with the industry. That means spending less time on low-level details so you can spend more time on the tools, techniques, and concepts used today. It isn't a "dumbing down" - it is changing the mix. You can only do so much in 4 years. What was "generalist" 20 years ago is "specialist" today, and it should be.
It isn't a "dumbing down" - it is changing the mix.
When it's leaving out background information, it's dumbing down. Programmers should at least know the basics of how indirection works. Why is it that so many interviewees with gold-plated GPAs would tell me null pointers used up no memory? Do they have the foggiest idea what happens when they add a member in a Python/Ruby program and how that differs from adding a pointer to a struct?
There's a difference between having the background information and treating everything as if it's hazy magic. It's excusable for the buyer of a car to treat the product they've bought as a magic black box. It's inexcusable when a "mechanic" or "engineer" is incapable of doing anything but treating things like magic black boxes.
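To make the indirection point concrete, here is a minimal sketch in Python (standard library ctypes only; the Node names are made up for illustration). A pointer member costs a pointer-sized slot in every instance even when it is NULL, while "adding a member" to a Python object is a runtime dictionary insert, not a change to a fixed memory layout.

    import ctypes

    class Node(ctypes.Structure):
        # C-style struct with a single int member
        _fields_ = [("value", ctypes.c_int)]

    class NodeWithNext(ctypes.Structure):
        # Same struct plus a pointer member
        _fields_ = [("value", ctypes.c_int),
                    ("next", ctypes.c_void_p)]

    print(ctypes.sizeof(Node))          # 4
    print(ctypes.sizeof(NodeWithNext))  # typically 16 on a 64-bit build: the
                                        # pointer takes 8 bytes (plus padding)
                                        # whether or not it is NULL

    class PyNode:
        def __init__(self, value):
            self.value = value

    n = PyNode(1)
    n.next = None                       # just inserts a key into n.__dict__
    print(n.__dict__)                   # {'value': 1, 'next': None}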
Scripting languages were for unwashed systems administrators, and no real programmer would touch them.
But all of the smarter people in my program knew two or more of them.
no real programmer would touch them. Functional programming was a weird little academic thing with no future. OO was "if it is a noun make it a class".
I worked for a company that had to fight against those prejudices and low levels of knowledge to sell licenses. We sold licenses to Fortune 500 companies so they could run billion dollar businesses on a "scripting language." You know what prepared me for working there? A generalist Comp Sci education!
20 years later, I literally haven't run a compiler in years. I use libraries for data structures. I don't need to worry about allocating memory, billion dollar industries run on scripting languages.
But you are a savvy user of those libraries because you have the background knowledge. You don't usually need to worry about allocating memory, but you know what the gotchas in extreme corner cases are. And if you had to have a custom library written in C++ for your dynamic language application, you'd know how to spec that out and hire for that while looking out for the details. I had at least a foggy idea past the buzzword level when I graduated. How about the kids who are graduating nowadays?
I got my CS degree in 98
In 98 I was in grad school.
You can only do so much in 4 years. What was "generalist" 20 years ago is "specialist" today, and it should be.
Here is what I see in way too many recent grads with a 3.75 GPA. They don't know any of the background, past a handwavy level. They have misconceptions that are outright wrong. Many of them seem to have spent 4-5 years doing nothing but using libraries and gluing stuff together. Hell, we learned that stuff too -- but we learned a bunch of other stuff at the same time, plus we learned what we didn't know and what to do about it. Then again, there was a contingent who only cared about learning X-Windows, because there were lots of coding jobs in X-Windows. Aren't the people who only learn particular technologies that are in the job market the moral equivalent?
Comp Sci is dumbing down to the level of consumers of magic tech. I know engineers and physicists who would have some idea of how to begin to recreate the tech they use if civilization fell. I think a lot of Comp Sci graduates, if they wound up with nothing but machines running machine code, would qualify for Golgafrinchan Ark Fleet Ship B.
Deeply nested if/else logic, very long functions. Imperative logic that could be better written with higher-level constructs like function composition, complicating code with micro-performance hacks, etc. In general, inflexible code.
Of course not all of this is because I learned C first. A lot of it was simply due to being a new programmer. But this kind of code is more prevalent in general in the C world. Just browse some open-source C projects.
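To make the contrast above concrete, here is a rough sketch in Python; the order-processing functions are hypothetical. The first version is the nested-conditional habit, the second expresses the same checks as small functions glued together with a tiny compose helper.

    # Nested-conditional style: one long function, branches within branches.
    def process_order_nested(order):
        if order is not None:
            if "@" in order.get("email", ""):
                if order.get("quantity", 0) > 0:
                    return {"email": order["email"].strip().lower(),
                            "quantity": order["quantity"]}
        return None

    # The same checks as small, single-purpose steps that pass None through.
    def require_email(order):
        return order if order and "@" in order.get("email", "") else None

    def require_quantity(order):
        return order if order and order.get("quantity", 0) > 0 else None

    def normalize_email(order):
        return {**order, "email": order["email"].strip().lower()} if order else None

    def compose(*steps):
        def run(order):
            for step in steps:
                order = step(order)
            return order
        return run

    process_order = compose(require_email, require_quantity, normalize_email)

    print(process_order({"email": " Foo@Example.COM ", "quantity": 2}))
    # {'email': 'foo@example.com', 'quantity': 2}

When the rules change, the composed version absorbs the change as one more small step in the pipeline instead of another level of nesting.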
I believe a lot of this stems from the "systems programming" mindset that goes along with learning C and C++. The requirements are very precise and well known, and don't change often. There is often a fairly precise "right answer" for how to do something, where the "right answer" is some combination of performance metrics. Compilers are like that, file systems are like that, TCP/IP stacks are like that, etc. The programming boundaries tend to be very bright.
The "systems programming" mindset is a liability when writing business apps where a sales person can blow up every assumption and design decision and boundry in one day. The "right answer" is not clear, and not easily measured. The "right answer" has more to do with writing code that is flexible and easy to change. That is hard to measure and a totally different way of thinking.
But what you are citing here isn't a problem with learning C and C++. It's a failing of a generalist education. You might have known enough to avoid the gotchas of concurrency, but just out of school, you didn't know what you didn't know about business application development.