freq_var is not saved by save_weights(), so restrict() does not work on loaded models #385

huangenyan opened this issue Feb 27, 2024 · 2 comments
Code to reproduce the issue

import tensorflow as tf
import tensorflow_recommenders as tfra
import tensorflow_recommenders_addons.dynamic_embedding as de

def build_model():
    embedding = de.keras.layers.Embedding(
        embedding_size=8,
        init_capacity=1000,
        restrict_policy=de.FrequencyRestrictPolicy,
        name='UserDynamicEmbeddingLayer',
    )
    return tf.keras.Sequential([
        embedding,
        tf.keras.layers.Dense(8, activation='relu'),
        tf.keras.layers.Dense(4),
        tf.keras.layers.Lambda(lambda x: tf.math.l2_normalize(x, axis=1)),
    ])

model = build_model()
model.compile(
    optimizer=de.DynamicEmbeddingOptimizer(tf.keras.optimizers.Adam()),
    loss=tf.keras.losses.BinaryCrossentropy(),
)

x_tensors = tf.convert_to_tensor([1, 2, 3, 4, 5], dtype=tf.int64)
y_tensors = tf.convert_to_tensor([1, 1, 1, 0, 0], dtype=tf.float32)
ds = tf.data.Dataset.from_tensor_slices((x_tensors, y_tensors)).batch(5)

model.fit(ds)

model.save_weights('test_model')
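
For reference, freq_var should be non-empty at this point (the counts are lost in save_weights(), not during training); a quick check:

# freq_var should report a non-zero size here
print(model.get_layer(index=0).params.restrict_policy.freq_var.size())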

If we then use the model in another file:

import tensorflow as tf
import tensorflow_recommenders as tfra
import tensorflow_recommenders_addons.dynamic_embedding as de

def build_model():
    embedding = de.keras.layers.Embedding(
        embedding_size=8,
        init_capacity=1000,
        restrict_policy=de.FrequencyRestrictPolicy,
        name='UserDynamicEmbeddingLayer',
    )
    return tf.keras.Sequential([
        embedding,
        tf.keras.layers.Dense(8, activation='relu'),
        tf.keras.layers.Dense(4),
        tf.keras.layers.Lambda(lambda x: tf.math.l2_normalize(x, axis=1)),
    ])

model = build_model()
model.compile(
    optimizer=de.DynamicEmbeddingOptimizer(tf.keras.optimizers.Adam()),
    loss=tf.keras.losses.BinaryCrossentropy(),
)

model.load_weights('test_model')
embedding = model.get_layer(index=0)
print(embedding.params.size())                           # embedding table size from the checkpoint
print(embedding.params.restrict_policy.freq_var.size())  # prints 0: freq_var was not restored
embedding.params.restrict(3)  # no effect, because freq_var is empty
print(embedding.params.size())  # unchanged

You'll find the size of freq_var is 0, and calling restrict() has no effect.

Is there a way to save freq_var so I can restrict the embedding size in a future training run? This is a very common scenario in daily incremental training.
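
One possible workaround is to round-trip freq_var manually. The following is a rough, unverified sketch that assumes freq_var is a de.Variable exposing export()/upsert() (as de.Variable does for its own tables); the file name freq_var.npz is arbitrary:

import numpy as np

# --- In the training script, alongside model.save_weights('test_model') ---
freq_var = model.get_layer(index=0).params.restrict_policy.freq_var
keys, values = freq_var.export()  # dump all (id, frequency) pairs
np.savez('freq_var.npz', keys=keys.numpy(), values=values.numpy())

# --- In the second script, after model.load_weights('test_model') ---
data = np.load('freq_var.npz')
freq_var = model.get_layer(index=0).params.restrict_policy.freq_var
freq_var.upsert(
    tf.constant(data['keys'], dtype=freq_var.key_dtype),
    tf.constant(data['values'], dtype=freq_var.value_dtype),
)
print(freq_var.size())  # should now match the trained size, so restrict() can take effect

This only patches around the symptom; a proper fix would be for save_weights() to track freq_var as part of the layer's checkpoint state.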

rhdong commented Feb 28, 2024

Hi @MoFHeka, could you have time to help with it? Thank you!

huangenyan (author) commented

Hi, any updates on this issue?
