Tensor Factorization Example

The document discusses tensor factorization in recommendation systems, highlighting its role in delivering personalized content by predicting user preferences based on historical data. It explains the concept of tensors as multi-dimensional arrays and details the process of tensor factorization, including rank-1 models and common methods like CP and Tucker decomposition. Challenges such as high computational costs and data sparsity are also addressed.


Title: Tensor Factorization in Recommendation Systems

Submitted By: Deepankar Pant (2403030019), Shantam Attry (2403030001)
Submitted To: Dr. Niyati Aggrawal
Recommendation Systems

● Deliver personalized content (e.g., movies, products, news).
● Examples: Netflix, Amazon, Spotify.
● Core idea: Predict user preference based on historical data.
Tensor Definition:
• A tensor is a multi-dimensional array.
  - Vector = 1D, Matrix = 2D, Tensor = 3D+
  - Example: 𝒳 ∈ ℝ^{I × J × K}
• A tensor is a multi-dimensional generalization of a matrix.
• Tensor factorization decomposes a tensor into latent factors.
• A rank-1 model approximates the original matrix (or tensor) with a single component: the outer product of two vectors (for a matrix) or of three vectors (for a 3D tensor).
• Think of it as 3D or higher-order matrix factorization!
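
A minimal NumPy sketch of these definitions (the array contents are arbitrary illustrations, not values from the slides): it shows how vector, matrix, and tensor differ only in the number of dimensions, and how a rank-1 3D tensor arises as the outer product of three vectors.

    import numpy as np

    # Vector = 1D, Matrix = 2D, Tensor = 3D+
    v = np.arange(4)                    # shape (4,): a 1D vector
    M = np.arange(12).reshape(3, 4)     # shape (3, 4): a 2D matrix
    X = np.arange(24).reshape(2, 3, 4)  # shape (2, 3, 4): a 3D tensor in R^{I x J x K}

    # A rank-1 3D tensor is the outer product of three vectors:
    # T[i, j, k] = a[i] * b[j] * c[k]
    a = np.array([1.0, 2.0])            # illustrative values
    b = np.array([1.0, 0.5, 2.0])
    c = np.array([3.0, 1.0])
    T = np.einsum('i,j,k->ijk', a, b, c)  # shape (2, 3, 2)

    print(v.ndim, M.ndim, X.ndim, T.shape)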
Original Matrix (X)
We have a 2×2 matrix:

X = [[2, 3], [6, 9]]

Factorization Model
We approximate X as the product of two rank-1 matrices A and B:
○ X ≈ A · B
○ Where A is 2×1 and B is 1×2.
Formulating Entry Equations
● Each entry Xij = ai * bj gives us:
○ a1*b1 = 2
○ a1*b2 = 3
○ a2*b1 = 6
○ a2*b2 = 9
Deriving Relationships
● Dividing the equations, we find:
○ b2/b1 = 3/2 → b2 = 1.5 · b1
○ a2/a1 = (6/b1)/(2/b1) = 3 → a2 = 3 · a1
○ This reveals a scaling ambiguity: (A, B) and (cA, B/c) are both solutions for any nonzero c (see the check below).
○ Scaling ambiguity means that in matrix or tensor factorization, the individual factors can be rescaled without changing the overall product, so the factorization is not uniquely determined.
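
A quick numerical check of this ambiguity (the factor values match the solution derived on the next slide; the scalar c is an arbitrary nonzero choice):

    import numpy as np

    A = np.array([[1.0], [3.0]])    # 2x1 factor
    B = np.array([[2.0, 3.0]])      # 1x2 factor
    c = 5.0                         # any nonzero scalar works

    # (A, B) and (c*A, B/c) produce exactly the same product
    print(np.allclose(A @ B, (c * A) @ (B / c)))   # True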
Fixing Parameters
● To get a unique solution, set a1 = 1:
○ a1 = 1
○ b1 = 2 / a1 = 2
○ b2 = 1.5 · b1 = 3
○ a2 = 3 · a1 = 3
Final Factor Matrices
● The rank-1 factorization yields:
○ A = [[1], [3]]
○ B = [[2, 3]]
○ Verification: A · B = [[2, 3], [6, 9]] = X
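
The verification can be reproduced in NumPy (a minimal sketch; NumPy itself is not part of the slides):

    import numpy as np

    X = np.array([[2.0, 3.0],
                  [6.0, 9.0]])
    A = np.array([[1.0],
                  [3.0]])            # 2x1 factor, with a1 = 1 fixed
    B = np.array([[2.0, 3.0]])       # 1x2 factor

    print(A @ B)                     # [[2. 3.] [6. 9.]]
    print(np.allclose(A @ B, X))     # True: the rank-1 factorization is exact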
Tensor Factorization in Recommendation

● Incorporates multiple contexts (see the sketch below):
○ Time (user × item × time)
○ Location (user × item × location)
○ Device, mood, etc.
● Improves accuracy and personalization.
● Common methods: CP Decomposition, Tucker Decomposition.
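
To make the context idea concrete, a minimal sketch of a user × item × time ratings tensor (the users, items, and rating values are hypothetical, invented for illustration):

    import numpy as np

    # Hypothetical context-aware ratings: 3 users x 4 items x 2 time bins.
    # 0.0 marks an unobserved (user, item, time) triple.
    ratings = np.zeros((3, 4, 2))
    ratings[0, 1, 0] = 5.0   # user 0 rated item 1 with 5 stars in time bin 0
    ratings[2, 3, 1] = 2.0   # user 2 rated item 3 with 2 stars in time bin 1

    # A recommender factorizes `ratings` and reads predictions off the zeros.
    users, items, times = np.nonzero(ratings)       # observed entries only
    print(list(zip(users.tolist(), items.tolist(), times.tolist())))
    print(ratings[users, items, times])             # the observed ratings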
CP (CANDECOMP/PARAFAC) Decomposition:

• Approximate a 3D tensor 𝒳 as a sum of R rank-1 tensors:
∘ 𝒳 ≈ Σ_{r=1}^{R} a_r ∘ b_r ∘ c_r
∘ Outer Product: Each term a_r ∘ b_r ∘ c_r, the outer product of three vectors, gives a rank-1 tensor (like a small block of data).
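
A minimal NumPy sketch of CP reconstruction from factor matrices A, B, C (the sizes, rank R, and random factor values are illustrative assumptions, not values from the slides). The einsum call computes the whole sum of R outer products in one step:

    import numpy as np

    def cp_reconstruct(A, B, C):
        """Sum of R rank-1 tensors: X[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r]."""
        return np.einsum('ir,jr,kr->ijk', A, B, C)

    rng = np.random.default_rng(0)
    I, J, K, R = 4, 5, 3, 2             # tensor dimensions and CP rank (illustrative)
    A = rng.standard_normal((I, R))     # e.g., user factors
    B = rng.standard_normal((J, R))     # e.g., item factors
    C = rng.standard_normal((K, R))     # e.g., time/context factors

    X_hat = cp_reconstruct(A, B, C)     # approximation of the tensor, shape (4, 5, 3)
    print(X_hat.shape)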
Why Rank-1 Factorization?
• Simplifies data into low-rank structures
• Reveals latent relationships
• Basis for recommender systems
• Handling scaling ambiguity ensures interpretability
Challenges
● High computational cost
● Data sparsity in high dimensions
● Choosing the right tensor rank