About Me


I’m a third-year undergraduate student in the School of EECS at Peking University, currently enrolled in the 23rd Zhi Class.

My research focuses on large language models (LLMs) and multimodal large language models (MLLMs). I’m open to exploring any interesting topics related to LLMs and MLLMs.

Projects

  • my-tensor: A lightweight deep learning framework implemented in C++ and CUDA, supporting convolutional neural networks and validated on datasets such as MNIST. In total, the project contains 11,612 lines of C++ and CUDA code.
  • llm-evaluator: A comprehensive framework for evaluating large language models (LLMs), supporting safety, capability, and refusal assessments and enabling unified evaluation of attacks on and defenses of model security. In total, the project contains 5,068 lines of Python code.
  • unify-llm: A unified large language model inference framework that supports multiple inference backends (API, Hugging Face, vLLM, etc.) and provides a unified interface with cache management. In total, the project contains 2,849 lines of Python code.

Awards

  • First Prize, 15th National College Student Mathematics Competition (Non-Mathematics Category A), 2023
  • National Scholarship, 2023-2024
  • Zhi-Class Scholarship, 2024-2025