Well, I must say, so "everybody else in the computer world" calls this "caching" or "cache"…
I can testify that I have been familiar with the term "memoization" for at least the last 14 years, since I learned about it in high school… (Ouch! I'm old…). I have used it often over the years, including with programmers in the industry and "everybody else in the computer world".
There are certain differences between memoization and caching.
Although some of the differences are only semantic, the terms are still distinct because they are used differently.
First of all: memoization is implemented in the algorithm itself and is software based, whereas caches and buffers are traditionally hardware mechanisms (although software caches and buffers exist as well).
"Memorization" is what we do with functions – Remembering the values of previous calls so the next one will (hopefully) run faster, usually without complicated algorithms behind the "how to remember". Memorization is the mechanism often used to improve the time complexity of algorithms in expense on space complicity.
The term "Cache" is usually applied to the mechanism used to accelerate reads from slow memory. The Cache is a very fast storage space (Cache speed is much greater then normal memory speed) in which we store values from a slower storage space in order for the next read of the SAME value (or block of data) will be done faster. Cache is used to accelerate the speed of future accesses for the data from the slow memory. i.e. – The data is there somewhere and I want it to be closer to (for example) the CPU so the overall accessing speed to that data by the CUP will be done faster. Behind the "Cache mechanism" there is whole world of logic of how to pull off data, what to pull, when, how and so on.
And "Buffer" (which you didn’t mention) for that matter, is the mechanism often used in order to "accelerate" access to a very very slow storage sites or communication ports that are much slower from the speed of preparing the data for storage/transmission. i.e. – The CPU pushes data to the buffer at high rate and the buffer is then responsible of sending the data onward at the appropriate speed (in that way the CPU is not slowed down by the external slow ports). Buffer is sometimes used to describe the temporary storing of data when sending this data from one program part to the other or between different hardware parts.
To conclude: the terms are similar but different. More information is available (guess where…) on Wikipedia.
And hey, at least say thanks to the CS1001 staff, who managed to introduce you to a term you did not know about ;)