What's the big deal with parallelism?
I was in a computer architecture class last semester, and the last chapter was about all this multiprocessor stuff. It's quite confusing and I didn't really learn it that well.
What's the big deal about it? Do multiple cores really provide the speedup they promise? Is task-level parallelism a good idea, or do the separate processes need to communicate with each other a lot?
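To be concrete about what I mean by task-level parallelism, here's a toy C++ sketch I put together (the function and variable names are just mine, for illustration): two independent chunks of work run on separate threads and never talk to each other until the final join.

```cpp
#include <cstddef>
#include <iostream>
#include <thread>
#include <vector>

// An independent "task": sum one slice of the array.
long long sum_range(const std::vector<int>& v, std::size_t lo, std::size_t hi) {
    long long s = 0;
    for (std::size_t i = lo; i < hi; ++i) s += v[i];
    return s;
}

int main() {
    std::vector<int> data(1'000'000, 1);
    long long left = 0, right = 0;

    // Task-level parallelism: each thread works on its own half,
    // so there is no communication until the joins at the end.
    std::thread t1([&] { left  = sum_range(data, 0, data.size() / 2); });
    std::thread t2([&] { right = sum_range(data, data.size() / 2, data.size()); });
    t1.join();
    t2.join();

    std::cout << "total = " << (left + right) << "\n";
}
```

My understanding is that this kind of "embarrassingly parallel" case is the best case, and the question is really about what happens when the tasks aren't independent like this.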
In that textbook I also read the section about GPUs. It was mostly NVIDIA buzzwords and didn't explain anything very well. Sounds like a lot of proprietary voodoo that only maybe the top five computer architecture people in the world really understand. Sounds like NVIDIA is also way ahead of any other CPU/GPU maker, including Intel, AMD, or even the government maybe lol, if they do that sort of thing.
And how does vector processing play into things? How many applications really rely on vector processing? I know a lot of this is geared at graphics and sometimes sound, but does it have much benefit for normal processing? Will it change how programming is done in a serious way?
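Here's the kind of loop I'm thinking of when I ask this: a plain, scalar-looking C++ function that, as I understand it, compilers can auto-vectorize into SIMD instructions when you turn optimizations on, with no graphics involved at all.

```cpp
#include <cstddef>
#include <vector>

// An ordinary scalar-looking loop (the classic "saxpy": y = a*x + y).
// With optimization enabled (e.g. g++ -O3 -march=native), the compiler
// can auto-vectorize this so each SIMD instruction processes several
// floats at once instead of one at a time.
void saxpy(float a, const std::vector<float>& x, std::vector<float>& y) {
    for (std::size_t i = 0; i < x.size(); ++i)
        y[i] = a * x[i] + y[i];
}
```

If loops like this get vectorized automatically, does the average programmer even need to think about it, or does serious use require writing intrinsics by hand?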
I guess this is kind of like the change from learning calculus in a scalar way (Calc 1 and early Calc 2) to transitioning to a vector way of thinking (vector calculus). The vector version still feels more complicated to me: I understand calculus from a scalar perspective, but I've forgotten a lot of those vector calc formulas and how they relate back to the scalar world. I wish math were taught in a more generalized way that accounts for vectors and matrices (arbitrary R^n) from the start.