Sales is the lowest-risk job skill I know of; it works even if we were hit by an apocalypse and had no electricity. In all likelihood, we'd still be making tools and selling them. It also helps you think in the problem/business domain. So you're not simply learning PHP to build a website; you're doing it to, say, help people find hotels.
It works on your resume, it works for job searching, it helps with getting promotions (proving yourself useful), and it helps you land a higher salary without negotiation. You still can't sell crap, so you need a good product (i.e. yourself, your skill) worth selling. But it complements any other skill you have.
There's also communication skill in general. How to refine ideas into denser, cleaner formats that people and AI can understand. This works for sales, and it works for code.
Tech-wise, I can't say. The nature of tech is that it's very high risk. If you've had a look at how GPT-3 is engineered [1], it makes you question whether algorithms and OO will be useful skills in the future. Sam Altman expects us to hit AGI by 2025 [2]. He's probably being a little optimistic, like every other programmer, so let's double the estimate and say we have until 2030. Codex itself placed #96 when pitted against 9,000+ humans in a coding challenge [3]. So whatever you pick, it should be fairly AI-proof.
Data will be around, and anything that deals with data will be helpful. Even if you could tell AI to do whatever, it has to pull data from somewhere. Spreadsheets are great. Databases will be around for a long time. The top 3 most used DBs or so use some variation of SQL.
There will also need to be some kind of front end for data for people (and even AI) to use. Low/no-code has been around forever, but there's always a domain it can't solve. Or something specialist like Shopify, Magento, or WordPress that solves a problem millions of people have. If you want something that combos well with higher-risk work, you can learn UI/UX.
Again, low risk, low returns. The absolute lowest risk is food. Everyone needs to eat. Farming and cooking will keep you from starving, but probably won't take you much higher than that.
Projective geometry and the perspective projection matrix (the extrinsic and intrinsic parameters of your modeled camera) are basically the "mathified" version of that. Or just "perspective drawing", as artists have been using it for 100+ years now.
In my opinion, computer graphics is basically applied math (or linear algebra). (You don't even need linear algebra or matrices to render stuff on the screen, but it will be painful to keep track of what's going on.)
Math hardly dies off, it seems. Maybe some methods change, like computing a result by constructing some geometry and measuring its length.
However, unless we no longer need to project something from 3D (world/scene) to 2D (image, monitor, photo, photosensitive sheet, ...), I think we can count on it for a long time.
What changes might be some algorithms. Earlier, we used the "edge walking" algorithm to fill triangles. Now, we use edge equations that tell you whether an image point is inside a triangle or not.
But they do essentially the same thing, namely, fill a triangle.
Things like the Phong model may stick around as well.
There are other methodologies that are still basically based on the pinhole camera model.
In ray tracing, you shoot rays into the scene and check whether they intersect an object. If a ray intersects something, you color that pixel with the intersected object's defined color. Otherwise, you leave it out and move on, pixel by pixel.
So shooting rays from the camera into the scene is basically the reverse of the pinhole camera, where light enters the hole and "colors" a photosensitive sheet of paper.
At least this is my understanding of it.
Just learn math, math, math, and you will be fine. Math is the lingua franca of engineering and science, so learn it and you'll understand the concepts from those other fields/branches.
So the syscalls of a Linux/Unix machine are the same because of the POSIX API. The POSIX API is a standard for *nix OSes.
Now, we have compilers such as gcc, clang and Microsoft's C++ compiler.
Do they decide on their own ABI (a specification of how things should be implemented at the lowest levels) then?
So then we have 2 APIs. That is, we have the C standard library (as an API) and the POSIX API.
The POSIX API defines the syscalls such as write, read, open, ...
The C or C++ standard library provides the header files (function declarations etc.), and its .so file (the shared library) lives in memory.
Take the Microsoft C++ compiler as an example. I cannot see how the standard library is implemented, because only the headers are visible and the actual compiled code "lives" as a shared library (a .so file on *nix, a .dll on Windows) loaded in memory. This shared library is accessible to all C/C++ applications, and it is essentially C's runtime environment (RTE).
(Also, C and C++ are standardized so no matter who implements the compiler and the standard library, it has to follow the language standard.)
The standard library, among other things, is not only a wrapper around syscalls (e.g. printf over write, or malloc over brk/mmap from the POSIX API); it also provides useful functions and algorithms such as qsort, std::transform, ... Furthermore, data structures or containers can also be part of a standard library: std::vector, std::unordered_map, ...
Now, if we compile a C or C++ program, the ABI is basically a set of definitions/rules that the compiler has to abide by.
Meaning things like how function parameters are passed (on the stack or in registers), how a function should be called, how operations are mapped to machine code, etc.
But an ABI is not only a mapping or layout between C instructions and machine code; it is also a mapping between OS syscalls and C or C++ code.
Exception handling in C and POSIX is basically this:
    #include <stdio.h>
    #include <setjmp.h>

    int main(void)
    {
        jmp_buf env;
        if (!setjmp(env))      /* try: bookmark this line; something might go wrong below */
            longjmp(env, 1);   /* throw: jump back to the bookmark, making setjmp return 1 */
        else                   /* catch */
            fprintf(stderr, "Yikes, something went wrong! ;)\n");
        return 0;
    }
So if I do exception handling in C++, the ABI of C++ has to specify an exception handling routine. Similarly, virtual functions can be emulated in plain C with structs of function pointers.
So vtables are structs of function pointers. The C++ compiler has to follow some ABI convention (namely, a rule for how to lay out the vtable such that it corresponds to the C++ code).
- syscalls ~ POSIX API (operating systems API) ~ write, read, open, ...
- C/C++ standard library (.so files loaded in computer memory) ~ wrapper for syscalls (printf, malloc, ...), containers/data structures/algorithms (std::vector, std::unordered_map, qsort, ...)
- ABI ~ rules for the compiler that tell how vtables, function calls, and exceptions should be implemented
And always use "-fwrapv -fno-strict-aliasing -fno-delete-null-pointer-checks". The behaviors those flags define away are the source of the hardest-to-reason-about UB, and the small performance benefit is not worth the risk. I would love to always be certain I haven't accidentally hit UB, but that's equivalent to the halting problem, so for peace of mind I just ask the compiler for a slightly safer variant of the language.
P.S. Old code definitely needs them: at the time, compilers didn't optimize so aggressively, and lots of code does weird stuff with memory, shifts, etc.
A reference is functionally equivalent to a const pointer. (Reference reassignment is disallowed. Likewise, you cannot reassign a const pointer. A const pointer is meant to keep its pointee [address].)
The difference between them is that C++ const references also allow non-lvalue arguments (temporaries).
It is much easier to decode such types by reading them from right to left. See for yourself:
- double (* const convert_to_deg)(double const x); // const pointer to function taking a const double and returning double
- int const (* ptr_to_arr)[42]; // pointer to array of 42 const ints
- int const * arr_of_ptrs[42]; // array of 42 pointers to const ints
Thank you! :)