SamuelBang/AesCoder-4B
SamuelBang/AesCoder-4B is a 4-billion-parameter language model developed by researchers at Microsoft Research Asia, Shanghai Jiao Tong University, and Peking University, published under the SamuelBang account. It is designed to improve the aesthetic quality of LLM-generated code, particularly for visually oriented tasks such as webpage design. The model is instruction-tuned on a large-scale dataset (AesCode-358K) and uses agentic reward feedback to jointly optimize functionality and code aesthetics, achieving performance comparable to much larger models on design benchmarks.
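
Below is a minimal usage sketch, assuming the model follows the standard Hugging Face causal-LM interface and ships a chat template; the prompt format and generation settings here are illustrative assumptions, not the official example.

```python
# Minimal sketch: loading AesCoder-4B with transformers and asking it to
# generate a webpage. Assumes a standard causal-LM checkpoint with a chat
# template; adjust to the official usage instructions if they differ.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "SamuelBang/AesCoder-4B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype="auto", device_map="auto"
)

# Hypothetical prompt for a visually oriented coding task (webpage design).
messages = [
    {"role": "user",
     "content": "Create a responsive landing page for a coffee shop using HTML and CSS."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=2048)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```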