wxgeorge/facebook-Meta-SecAlign-70B

wxgeorge/facebook-Meta-SecAlign-70B is a 70-billion-parameter language model created by merging Meta's Llama-3.3-70B-Instruct with Meta-SecAlign-70B using the passthrough merge method. The merge combines the general instruction-following capability of Llama-3.3 with the security alignment of Meta-SecAlign, making it suitable for applications where both reliable instruction following and adherence to security constraints are critical.

Availability: Warm
Visibility: Public
Parameters: 70B
Quantization: FP8
Context length: 32,768 tokens
Weights: Hugging Face
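A minimal usage sketch with Hugging Face transformers follows. It assumes the merged weights are published under the wxgeorge/facebook-Meta-SecAlign-70B repository and that your hardware can host a 70B checkpoint; the prompt and generation parameters are illustrative, not prescribed by the model authors.

```python
# Minimal sketch: load the merged model from Hugging Face and run one chat turn.
# Assumes the weights are available as "wxgeorge/facebook-Meta-SecAlign-70B" and
# that enough GPU memory is available for a 70B checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "wxgeorge/facebook-Meta-SecAlign-70B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the dtype stored in the checkpoint
    device_map="auto",    # shard across available GPUs (requires accelerate)
)

# Llama-3.3-style chat formatting via the tokenizer's chat template.
messages = [
    {"role": "system", "content": "You are a helpful, security-conscious assistant."},
    {"role": "user", "content": "Summarize the attached report in three bullet points."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The device_map="auto" option relies on the accelerate package to place the shards; for a 70B model this typically means multiple GPUs or CPU offload.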
