Echo: A Large Language Model with Temporal Episodic Memory




🛠️ Code   |   📚 Dataset   |   🧠 7B Model   |   🧠 72B Model


🚀 Welcome to Echo!

Echo is a cutting-edge large language model (LLM) designed with temporal episodic memory, enabling advanced reasoning and context retention. We invite the community to explore, use, and cite our work!
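
⚡ Quick Start

The snippet below is a minimal sketch of loading and querying Echo, assuming the released checkpoints work with the standard Hugging Face `transformers` causal-LM API and are hosted under the repo id `ALmonster/Echo1-7B`; refer to the Code and Model links above for the authoritative usage.

```python
# Minimal usage sketch (assumes a standard causal-LM checkpoint at "ALmonster/Echo1-7B").
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ALmonster/Echo1-7B"  # assumed repo id; swap in the 72B variant if preferred

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")  # device_map requires `accelerate`

# A prompt that exercises recall of earlier conversational context.
prompt = "Earlier I mentioned I adopted a cat named Miso. What is my cat's name?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```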


📖 Citation

If you use Echo in your research or applications, please cite us:

```bibtex
@misc{liu2025echolargelanguagemodel,
  title={Echo: A Large Language Model with Temporal Episodic Memory},
  author={WenTao Liu and Ruohua Zhang and Aimin Zhou and Feng Gao and JiaLi Liu},
  year={2025},
  eprint={2502.16090},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2502.16090},
}
```

🌟 Get Involved

We welcome contributions, feedback, and collaboration from the community. Feel free to open issues or pull requests on our GitHub!


Empowering AI with memory. Echo: Remember, Reason, Respond.
