---
license: apache-2.0
language:
- en
metrics:
- accuracy
- perplexity
pipeline_tag: text-generation
library_name: transformers
---

# SAM1 Hybrid Model

## Architecture

- Transformer + CNN + RNN hybrid
- Parameters: 253,748,736 (~253.7M)
- 24 layers × 768 hidden size × 12 attention heads

## Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model (replace "path/to/model" with the actual checkpoint path).
tokenizer = AutoTokenizer.from_pretrained("path/to/model")
model = AutoModelForCausalLM.from_pretrained("path/to/model")

# Build a chat-style prompt and generate a reply.
prompt = "User: Hello!\nSam:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_length=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
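
If you want to cross-check the parameter count listed under Architecture (253,748,736), you can sum the model's tensors after loading. The snippet below is a minimal sketch that reuses the placeholder checkpoint path from the usage example above.

```python
from transformers import AutoModelForCausalLM

# Load the model (same placeholder path as in the usage example).
model = AutoModelForCausalLM.from_pretrained("path/to/model")

# Sum the number of elements across all parameter tensors.
total = sum(p.numel() for p in model.parameters())
print(f"Total parameters: {total:,}")  # expected: 253,748,736 (~253.7M)
```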