This is the Linux app named CodeLlama, whose latest release can be downloaded as codellamasourcecode.tar.gz. It can be run online through OnWorks, a free hosting provider for workstations.
Download and run this app named CodeLlama online with OnWorks for free.
Follow these instructions to run this app:
- 1. Download this application to your PC.
- 2. Enter your desired username in our file manager at https://www.onworks.net/myfiles.php?username=XXXXX.
- 3. Upload this application to that file manager.
- 4. Start the OnWorks Linux online, Windows online, or MacOS online emulator from this website.
- 5. From the OnWorks Linux OS you have just started, go to our file manager at https://www.onworks.net/myfiles.php?username=XXXXX with the username you chose.
- 6. Download the application, install it, and run it.
SCREENSHOTS
CodeLlama
DESCRIPTION
Code Llama is a family of Llama-based code models optimized for programming tasks such as code generation, completion, and repair, with variants specialized for base coding, Python, and instruction following. The repo documents the sizes and capabilities (e.g., 7B, 13B, 34B) and highlights features like infilling and large input context to support real IDE workflows. It targets both general software synthesis and language-specific productivity, offering strong performance among open models at release time. Typical usage includes prompt-driven generation, function or class completion, and zero-shot adherence to natural-language instructions about code changes. The ecosystem provides multiple distributions (e.g., HF format) so developers can integrate with standard toolchains and serving stacks. As part of the broader Llama effort, Code Llama complements instruction-tuned chat models by focusing on code-centric tasks and editor integrations.
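To make the prompt-driven generation described above concrete, here is a minimal sketch using the Hugging Face-format distribution mentioned in the description. The checkpoint name (codellama/CodeLlama-7b-hf), the example prompt, and the generation settings are assumptions; substitute the model size, variant, and decoding parameters that fit your setup.

```python
# Minimal sketch: prompt-driven code completion with HF-format Code Llama weights.
# Assumes the `transformers` and `torch` packages and enough GPU memory for a 7B model in fp16.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "codellama/CodeLlama-7b-hf"  # assumed checkpoint name; pick the size/variant you need
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to reduce memory footprint
    device_map="auto",
)

# Complete a function from its signature and docstring.
prompt = 'def fibonacci(n: int) -> int:\n    """Return the n-th Fibonacci number."""\n'
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The same pattern extends to the Python and Instruct variants; only the checkpoint name and the prompt format change.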
FEATURES
- Multiple model sizes for different latency/quality needs
- Specializations for base coding, Python, and instruction following
- Infilling for middle-of-file completions (see the sketch after this list)
- Strong zero-shot and prompt-conditioned code synthesis
- Support for long context windows in coding workflows
- Distributions compatible with common serving toolchains
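As a hedged illustration of the infilling feature, the sketch below uses the <FILL_ME> marker that the Hugging Face Code Llama tokenizer expands into the model's prefix/suffix infilling format. The checkpoint name and the example prompt are assumptions, and infilling is only supported by some checkpoints (e.g., the 7B and 13B base models).

```python
# Sketch of middle-of-file infilling via the HF tokenizer's <FILL_ME> marker.
# Assumes an infilling-capable checkpoint (e.g., the 7B base model in HF format).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "codellama/CodeLlama-7b-hf"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16, device_map="auto")

# The model sees the code before and after <FILL_ME> and generates the missing middle.
prompt = '''def remove_non_ascii(s: str) -> str:
    """<FILL_ME>
    return result
'''
input_ids = tokenizer(prompt, return_tensors="pt")["input_ids"].to(model.device)
generated = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens, then splice them back into the prompt.
filling = tokenizer.batch_decode(generated[:, input_ids.shape[1]:], skip_special_tokens=True)[0]
print(prompt.replace("<FILL_ME>", filling))
```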
PROGRAMMING LANGUAGE
Python
CATEGORIES
This application can also be fetched from https://sourceforge.net/projects/codellama.mirror/. It is hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.