We identified varying levels of ownership—the degree of control a user has over the elements of an AI system—as an additional dimension of openness in AI. An "open AI system" comprises three parts: the model, the host machine, and the AI stack. The following table presents configurations of these three parts in decreasing order of transparency:

| Model | Host machine | AI stack | Contribution barrier | Types of applications the configurations tend to be used in |
|---|---|---|---|---|
| Open model | Local machine | Open stack | | |
| Open model | Cloud compute | Open stack | | |
| Open model | Model routing platform | Open stack | | |
| Open model | Model routing platform | Closed stack | | |
| Closed model | Service provider (through API) | Open stack | | |
| Closed model | Service provider (through proprietary interface) | Closed stack | | |
  • Local machine: hardware that the user permanently owns in physical form
  • Cloud compute: services that rent out computers for a fixed amount of time, like Vast.ai
  • Model routing platform: services that enable running inference on many models through a single API, like OpenRouter
  • Service provider (through API): services that enable running inference on the provider’s in-house models through a proprietary API, like OpenAI’s API
  • Service provider (through proprietary interface): services that enable running inference on the provider’s in-house models through a proprietary user interface

Even though cloud compute, model routing platforms, APIs, and proprietary interfaces could all be categorized as LLMs being served from the cloud, their host systems come with varying degrees of abstraction in providing access to the models. While abstraction has the advantage of convenience, it can also come at the cost of transparency—the extent to which the user knows the mechanisms by which a model is served to them.
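The layering described above can be sketched concretely: because many of these hosts expose an OpenAI-compatible chat-completions interface, the same request shape can target a local machine, a rented cloud box, a routing platform, or a provider API by swapping only the base URL and model identifier. The endpoints and model names below are illustrative assumptions, not a verified or exhaustive list.

```python
def build_chat_request(base_url: str, model: str, prompt: str) -> dict:
    """Assemble a host-agnostic, OpenAI-style chat-completion request."""
    return {
        "url": f"{base_url}/chat/completions",
        "payload": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

# Hypothetical hosts at each abstraction level; only the base URL and
# the model identifier change, while the request shape stays the same.
hosts = {
    "local machine":          ("http://localhost:8080/v1", "llama-3-8b"),
    "cloud compute":          ("http://203.0.113.5:8000/v1", "llama-3-8b"),
    "model routing platform": ("https://openrouter.ai/api/v1", "meta-llama/llama-3-8b-instruct"),
    "service provider (API)": ("https://api.openai.com/v1", "gpt-4o"),
}

for level, (base_url, model) in hosts.items():
    req = build_chat_request(base_url, model, "Hello")
    print(f"{level}: POST {req['url']}")
```

What the user can no longer see grows at each level down the table: on a local machine, every component of the serving path is inspectable; behind a proprietary interface, only the request and response are.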

Across these six levels of transparency, there are three levers of control: one can choose an open or a closed stack, one's own computer or an API, and an open or a closed model. We explore our observations across these three levers below.

The first lever of control is choosing between a closed and an open stack. By choosing an open stack, users gain control over the system in which their models are used, in a model- and host-agnostic way. We observed that user innovation in this space happens through shared GitHub repositories, which often address problems that are common and in demand across the community.