This is the Linux app named MLRun, whose latest release can be downloaded as mlrun-demos.tar. It can be run online in OnWorks, a free hosting provider for workstations.
Download and run this app named MLRun online with OnWorks for free.
Follow these instructions to run this application:
- 1. Download this application to your PC.
- 2. Enter our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 3. Upload this application in that file manager.
- 4. Start the OnWorks Linux online emulator, Windows online emulator, or MacOS online emulator from this website.
- 5. From the OnWorks Linux OS you have just started, go to our file manager https://www.onworks.net/myfiles.php?username=XXXXX with the username that you want.
- 6. Download the application, install it, and run it.
SCREENSHOTS
MLRun
DESCRIPTION
MLRun is an open MLOps framework for quickly building and managing continuous ML and generative AI applications across their lifecycle. MLRun integrates into your development and CI/CD environment and automates the delivery of production data, ML pipelines, and online applications, significantly reducing engineering efforts, time to production, and computation resources. MLRun breaks the silos between data, ML, software, and DevOps/MLOps teams, enabling collaboration and fast continuous improvements. In MLRun the assets, metadata, and services (data, functions, jobs, artifacts, models, secrets, etc.) are organized into projects. Projects can be imported/exported as a whole, mapped to git repositories or IDE projects (in PyCharm, VSCode, etc.), which enables versioning, collaboration, and CI/CD. Project access can be restricted to a set of users and roles.
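The project-centric workflow described above can be illustrated with MLRun's Python SDK. The following is a minimal sketch, assuming a local installation (pip install mlrun); the project name, the trainer.py file, and its train handler are placeholders chosen for illustration, not part of the MLRun distribution.

# Minimal sketch of the project-based workflow, assuming `pip install mlrun`
# and a hypothetical trainer.py that defines a train(context, epochs) handler.
import mlrun

# Create or load a project; functions, runs, artifacts, and secrets are scoped
# to it and can be mapped to a git repository for versioning and CI/CD.
project = mlrun.get_or_create_project("mlrun-demo", context="./")

# Register local code as a project function (here a batch job using the stock image).
project.set_function("trainer.py", name="trainer", kind="job",
                     image="mlrun/mlrun", handler="train")

# Execute the function; parameters, logs, and output artifacts are tracked automatically.
run = project.run_function("trainer", params={"epochs": 3}, local=True)
print(run.outputs)

Setting local=True runs the handler in the current environment; dropping it lets the same function run as a job on the configured cluster, which is the kind of code-to-production transition the description refers to.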
Features
- Rapid deployment of code to production pipelines
- Elastic scaling for batch and real-time workloads
- Documentation available
- Feature management: ingestion, preparation, and monitoring
- Works anywhere: in your local IDE, multi-cloud, or on-prem
- Built-in integrations
Programming Language
Python
Categories
This is an application that can also be fetched from https://sourceforge.net/projects/mlrun.mirror/. It has been hosted on OnWorks so that it can be run online in the easiest way from one of our free operating systems.