Model Repositories

TIR Model Repositories are storage buckets that hold model weights and config files. A model may be backed by E2E Object Storage (EOS) or by PVC storage in Kubernetes.

The concept of a model is loosely defined. There is no hard structure, framework, or format that you must adhere to. Rather, you can think of a model as a simple directory hosted on either EOS or a disk. This opens up possibilities for versioning through the directory structure: you may define sub-folders like v1, v2, etc. to track different model versions, as illustrated below.
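For example (an illustrative layout, not a required structure), a model bucket with two versions might look like this:

<model-bucket>/
├── v1/    # first version of the weights and config files
└── v2/    # a later revision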

When you define a model in a project, every team member has access to it. This enables re-use and collaboration among team members, so we recommend defining a TIR model to store, use, and share your model weights.

Uploading weights to TIR Models

When you create a new model, TIR automatically creates an EOS bucket to store your model weights and configuration files (e.g. the tsconfig file in TorchServe). You can find the connection details for this bucket on the model's Setup tab.

Note

If you have not used EOS (E2E Object Storage) before, read here.

EOS offers an S3-compatible API to upload and download content, so you can use any S3-compatible CLI, such as mc (by MinIO) or s3cmd.

We recommend using the MinIO CLI (mc). In the TIR model's Setup section, you will find ready-to-use commands to configure the client and upload content.

A typical command to set up the CLI looks like this. MinIO has a concept of an alias, which represents a connection profile; here we use the model name as the alias (or connection profile):

mc config host add <tir-model-name> https://objectstore.e2enetworks.net <access-key> <secret-key>
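The placeholders are filled in for you on the model's Setup page. To confirm the alias works, you can list the bucket through it (the bucket name below is the example used in the rest of this page):

# list the contents of the model bucket through the new alias
mc ls <tir-model-name>/llma-7b-23233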

Once you set up the alias (or connection profile), you can start uploading content with commands like these:

# upload contents of saved-model directory to llma-7b-23233 (EOS Bucket).
mc cp -r /user/jovyan/saved-model/* <tir-model-name>/llma-7b-23233

# upload contents of saved-model to llma-7b-23233(bucket)/v1(folder)
mc cp -r /user/jovyan/saved-model/* <tir-model-name>/llma-7b-23233/v1
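If you prefer to keep a local directory and a bucket folder in sync (including sub-folders), mc also provides a mirror command; a minimal sketch with the same example names:

# mirror the saved-model directory into the v1 folder of the bucket
mc mirror /user/jovyan/saved-model <tir-model-name>/llma-7b-23233/v1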

Note

We recommend uploading model weights and config files such that they can be easily downloaded and used by TIR notebooks or inference service pods.

For Hugging Face models, the entire snapshot folder (under .cache/huggingface/hub/<model-name>/) needs to be uploaded to the model bucket.

When this is done correctly, you will be able to download the weights (and configs) on any inference service pod or TIR notebook and load the model with the AutoModelForCausalLM.from_pretrained() call.
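As a minimal sketch of that last step (assuming the snapshot was downloaded to the local directory used in the examples on this page, and that the transformers library is installed):

# load the downloaded weights and configs into transformers
from transformers import AutoModelForCausalLM, AutoTokenizer

model_dir = "/user/jovyan/download-model"  # local directory holding the downloaded snapshot
tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForCausalLM.from_pretrained(model_dir)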

Downloading weights from TIR Models

The model weights are needed on the target device whether you are fine-tuning or serving inference requests through an API.

To download the contents of a TIR model manually:

# download contents of llma-7b-23233 (EOS bucket) to the local download-model directory.
mc cp -r <tir-model-name>/llma-7b-23233/ /user/jovyan/download-model/

# download only the v1 folder from llma-7b-23233 (bucket).
mc cp -r <tir-model-name>/llma-7b-23233/v1/ /user/jovyan/download-model/

Typical use cases for downloading content from TIR models:

  • Downloads to a local device for fine-tuning. You can install and use the mc command to download the model files.

  • Downloads to a TIR notebook for fine-tuning or running inference tests. You can use the mc command provided in the TIR notebook to download the model files.

  • Downloads to an Inference Service (Model Endpoints). Once you attach a model to an endpoint, the model files are automatically downloaded to the container.

How to create Model Repositories in TIR dashboard?

To create a Model Repository, go to the ‘Inference’ section, select Model Repository, and then click the CREATE REPOSITORY button.

../_images/modelrepository1.png

To create a Model Repository, you have to select a model type: PyTorch, Triton, or Custom.

../_images/modeltype.png

After that, select a storage type (New EOS Bucket, Existing EOS Bucket, or External EOS Bucket) and click the Create button.

../_images/modelstorage.png

1. New EOS Bucket

If you select the storage type New EOS Bucket, the following screen appears.

../_images/modelneweoscreate.png

After you click the Create button, a Configure EOS Bucket screen for uploading model weights appears on the next page.

Using SDK

The Using SDK tab shows the commands to install the SDK and the snippets to run in a Python shell or Jupyter notebook.
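The exact, ready-to-run snippets are generated for your repository in this tab. As a rough sketch, the installation step is a single pip command (the package name below is an assumption; use the one shown in the tab):

# install the TIR SDK (verify the package name against the Using SDK tab)
pip install e2enetworks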

../_images/newusingsdk.png

Using TIR Notebook

../_images/newusingtirnotebook.png

Using CLI

The Using CLI tab shows the commands to set up the MinIO CLI (mc).

../_images/newusingCLI.png

It also shows the commands to set up s3cmd.

../_images/news3cmd.png
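If you prefer s3cmd over mc, a minimal upload sketch looks like this (the endpoint, keys, and bucket name come from the Setup tab; the bucket below is the example used earlier):

# one-time setup: interactively enter the EOS endpoint, access key and secret key
s3cmd --configure

# recursively upload the contents of saved-model to the example bucket
s3cmd put --recursive /user/jovyan/saved-model/ s3://llma-7b-23233/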

After the model repository is created successfully, it appears in the model list, where you can view its Setup and Overview tabs.

../_images/setupTab.png ../_images/OverviewTab.png

2. Existing EOS Bucket

If you select the storage type Existing EOS Bucket, you can choose one of your existing buckets.

../_images/existingeosbucket2.png

After you click the Create button, the same Configure EOS Bucket screen appears, with the Using SDK, Using TIR Notebook, and Using CLI tabs described above. Once the repository is created, it appears in the model list with its Setup and Overview tabs.

3. External EOS Bucket

  • If you select the storage type External EOS Bucket, you can use an EOS bucket not owned by you to store model artifacts.

  • The Bucket Name, Access Key, and Secret Key are required to access the bucket.

  • Please make sure that the Access Key and Secret Key pair is attached to the mentioned bucket.

../_images/externaleos.png

After you click the Create button, the same Configure EOS Bucket screen appears, with the Using SDK, Using TIR Notebook, and Using CLI tabs described above. Once the repository is created, it appears in the model list with its Setup and Overview tabs.

Delete Model Repository

To delete a Model Repository, select the model in question and click the delete icon.

../_images/deletemodel1.png

A confirmation popup will then open; click the Delete button.

../_images/deletemodel2.png