The Cls class is a wrapper around your local Python classes. It can be sent to, and live remotely on, your compute, then be called natively in Python from your local environment while it runs on that remote compute.
Builds an instance of Cls.
- Parameters:
class_obj (Cls, optional) – The class to be executed remotely. If not provided and name is specified, will reload an existing cls object.
name (str, optional) – The name to give the remote class. If not provided, will use the class's name.
get_if_exists (bool, optional) –
Controls how service lookup is performed when loading by name.
If True (default): Attempt to find an existing service using a standard fallback order (e.g., username, git branch, then prod).
If False: Only look for an exact name match; do not attempt any fallback.
This allows you to control whether and how the loader should fall back to alternate versions of a service (such as QA, prod, or CI versions) if the exact name is not found.
reload_prefixes (Union[str, List[str]], optional) – A list of prefixes to use when reloading the class (e.g., ["qa", "prod", "git-branch-name"]). If not provided, will use the current username, git branch, and prod.
Example:
import kubetorch as kt

remote_cls = kt.cls(MyClass, name="my-class").to(kt.Compute(cpus=".1"))
result = remote_cls.my_method(1, 2)
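If the class was already deployed, it can be reloaded by name alone rather than re-sending the local class object. A minimal sketch, assuming a service named "my-class" (exposing my_method, as above) already exists; the reload_prefixes values are illustrative:
import kubetorch as kt

# Reload the previously deployed class by name only. With get_if_exists=True
# (the default), lookup falls back through the standard prefixes
# (username, git branch, prod) if the exact name is not found.
remote_cls = kt.cls(name="my-class", reload_prefixes=["qa", "prod"])
result = remote_cls.my_method(1, 2)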
- __init__(name: str, pointers: tuple | None = None, init_args: dict | None = None)
Initialize a Cls object for remote class execution.
Note: To create a Cls, please use the factory method cls().
- Parameters:
name (str) – The name of the class to be executed remotely.
pointers (tuple, optional) – A tuple containing pointers/references needed to locate and execute the class, typically including module path and class name.
init_args (dict, optional) – Dictionary of arguments to pass to the class constructor. Defaults to None.
Whether to run the function or class methods in async mode.
Endpoint for the module.
Compute object corresponding to the module.
Helper method to deploy modules specified by the @compute decorator. Used by kt deploy CLI command. Deploys the module to the specified compute.
Async helper method to deploy modules specified by the @compute decorator. Used by kt deploy CLI command when multiple modules are present. Deploys the module to the specified compute asynchronously.
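These helpers back the kt deploy workflow, in which a class annotated with the @compute decorator is picked up and deployed from the CLI. A minimal sketch, assuming the decorator is exposed as kt.compute and accepts resource arguments like those of kt.Compute (the exact decorator signature and CLI invocation are not documented here):
import kubetorch as kt

# Assumed decorator arguments; mirrors kt.Compute(cpus=".1") from the examples above.
@kt.compute(cpus=".1")
class MyClass:
    def my_method(self, a, b):
        return a + b

# The decorated class is then deployed via the kt deploy CLI command.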
Reload an existing callable by its service name.
Name of the function or class.
Namespace where the service is deployed.
Default serialization format for this module.
Knative service configuration loaded from Kubernetes API.
Name of the Knative service, formatted according to k8s regex rules.
Delete the service and all associated resources.
Send the function or class to the specified compute.
- Parameters:
compute (Compute) – The compute to send the function or class to.
init_args (Dict, optional) – Initialization arguments, which may be relevant for a class.
stream_logs (bool, optional) – Whether to stream logs during service launch. If None, uses the global config value.
verbosity (Union[verbosity, str], optional) – Verbosity of the logs streamed back to the client. If not specified, will stream select service logs. Can also be controlled globally via the config value log_verbosity. Supported values: "debug", "info", "critical".
get_if_exists (Union[bool, List[str]], optional) –
Controls how service lookup is performed to determine whether to send the service to the compute.
If False (default): Do not attempt to reload the service.
If True: Attempt to find an existing service using a standard fallback order (e.g., username, git branch, then prod). If found, re-use that existing service.
reload_prefixes (Union[str, List[str]], optional) – A list of prefixes to use when reloading the function (e.g., ["qa", "prod", "git-branch-name"]). If not provided, will use the current username, git branch, and prod.
dryrun (bool, optional) – Whether to set up and return the object as a dryrun (True), or to actually launch the compute and service (False).
- Returns: The module instance.
- Return type: Module
Example:
import kubetorch as kt

remote_cls = kt.cls(SlowNumpyArray, name=name).to(
    kt.Compute(cpus=".1"),
    init_args={"size": 10},
    stream_logs=True,
)
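To reuse a service that was already launched rather than redeploying it, pass get_if_exists=True. A sketch, reusing the SlowNumpyArray example above:
import kubetorch as kt

# If a matching service is found (falling back through username, git branch,
# then prod), it is reused instead of launching a new one.
remote_cls = kt.cls(SlowNumpyArray, name=name).to(
    kt.Compute(cpus=".1"),
    init_args={"size": 10},
    get_if_exists=True,
)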
Async version of the .to method. Send the function or class to the specified compute asynchronously.
- Parameters:
compute (Compute) – The compute to send the function or class to.
init_args (Dict, optional) – Initialization arguments, which may be relevant for a class.
stream_logs (bool, optional) – Whether to stream logs during service launch. If None, uses the global config value.
verbosity (Union[verbosity, str], optional) – Verbosity of the logs streamed back to the client. If not specified, will stream select service logs. Can also be controlled globally via the config value log_verbosity. Supported values: "debug", "info", "critical".
get_if_exists (Union[bool, List[str]], optional) –
Controls how service lookup is performed to determine whether to send the service to the compute.
If False (default): Do not attempt to reload the service.
If True: Attempt to find an existing service using a standard fallback order (e.g., username, git branch, then prod). If found, re-use that existing service.
reload_prefixes (Union[str, List[str]], optional) – A list of prefixes to use when reloading the function (e.g., ["qa", "prod", "git-branch-name"]). If not provided, will use the current username, git branch, and prod.
dryrun (bool, optional) – Whether to set up and return the object as a dryrun (True), or to actually launch the compute and service (False).
- Returns: The module instance.
- Return type: Module
Example:
import kubetorch as kt

remote_cls = await kt.cls(SlowNumpyArray, name=name).to_async(
    kt.Compute(cpus=".1"),
    init_args={"size": 10},
    stream_logs=True,
)
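Because to_async is a coroutine, it must be awaited inside a running event loop. A minimal sketch using asyncio, reusing the SlowNumpyArray example above:
import asyncio

import kubetorch as kt

async def deploy():
    # Launch the service without blocking the event loop.
    return await kt.cls(SlowNumpyArray, name=name).to_async(
        kt.Compute(cpus=".1"),
        init_args={"size": 10},
    )

remote_cls = asyncio.run(deploy())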