
InternLM2.5-7B-Chat-1M
A 7-billion-parameter chat model with an ultra-long context window
- Supports a 1M-token context window, well suited to long-text tasks
- Leading mathematical-reasoning accuracy among models of comparable size
- Upgraded tool-calling capability, supporting multi-turn calls to complete complex tasks
- Supports gathering information from hundreds of web pages for analysis and reasoning
- Local and streaming inference with LMDeploy and Transformers
- Compatible with vLLM, enabling deployment of OpenAI-compatible API services
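Once an OpenAI-compatible server is running (for example via vLLM), the API can be exercised with a plain HTTP request. The sketch below is illustrative only: the endpoint URL, port, and launch command are assumptions (vLLM's OpenAI-compatible server listens on port 8000 by default), and the model id is the model's Hugging Face repo name.

```python
import json

# Assumed local endpoint; vLLM's OpenAI-compatible server defaults to
# port 8000 when launched with e.g.:
#   vllm serve internlm/internlm2_5-7b-chat-1m --trust-remote-code
API_URL = "http://localhost:8000/v1/chat/completions"
MODEL = "internlm/internlm2_5-7b-chat-1m"

def build_chat_request(user_prompt, max_tokens=512, stream=False):
    """Build an OpenAI-style chat-completions payload."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": user_prompt}],
        "max_tokens": max_tokens,
        "stream": stream,
    }

if __name__ == "__main__":
    import urllib.request

    payload = build_chat_request("Summarize the attached report.")
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    # Requires a running server; prints the assistant's reply.
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())
    print(reply["choices"][0]["message"]["content"])
```

Setting `"stream": True` requests server-sent chunks instead of a single response body, which is useful for interactive long-context sessions.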
Product Details
InternLM2.5-7B-Chat-1M is an open-source chat model with 7 billion parameters. It has strong reasoning ability and surpasses models of comparable size in mathematical reasoning. The model supports a 1M-token context window and can handle long-text tasks such as those in LongBench. It also offers powerful tool-calling capabilities and can gather information from hundreds of web pages for analysis and reasoning.
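As a concrete starting point for the local and streaming inference mentioned above, the sketch below uses the standard Hugging Face Transformers API with a `TextStreamer`; LMDeploy offers an analogous `pipeline` API. The model id, dtype, and generation settings are assumptions, not specifications from this page; the model download and GPU work are kept under the main guard.

```python
def build_messages(user_prompt, system_prompt=None):
    """Assemble a chat-template message list (OpenAI-style roles)."""
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": user_prompt})
    return messages

if __name__ == "__main__":
    # Heavy imports and the multi-GB model download stay under the guard.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer

    model_id = "internlm/internlm2_5-7b-chat-1m"  # assumed HF repo id
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,
        device_map="auto",
        trust_remote_code=True,
    )

    messages = build_messages("Explain the pigeonhole principle briefly.")
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    # TextStreamer prints tokens to stdout as they are generated.
    streamer = TextStreamer(tokenizer, skip_prompt=True)
    model.generate(input_ids, streamer=streamer, max_new_tokens=256)
```

Note that approaching the full 1M-token window requires substantial GPU memory; the dedicated serving backends (LMDeploy, vLLM) are the practical route for very long inputs.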