# AI

> [!warning] This is a **hypothetical**
> Note this is a *hypothetical* consideration of a Moodle feature that does not exist **at all** yet!

Moodle [IDEA-230](https://tracker.moodle.org/browse/IDEA-230) suggests the concept of AI Providers. These are a set of classes that implement and mediate access from Moodle to an AI provider such as OpenAI, or to a local LLM such as Ollama. A Moodle system administrator would be expected to install an AI Provider plugin on their server and then create "instances" of it, each linked to a particular AI service. The exact nature would depend on the AI provider, but it is conceivable that each instance of an OpenAI AI Provider could have its own OpenAI API key.

![[AIProviderCreate.jpg]]

A server may have a number of AI Provider instances, with their availability limited to particular Moodle scopes.

![[AIProviderList.jpg]]

In the example above there are two AI Provider instances available on the server: a "global AI" which can be used by any AI Consumer, and an "ollama" instance which can only be used by AI Consumers within the "Faculty of Science/Biomedical Engineering" category.

AI Providers are also important because they may be used to specify which LLM is used by the underlying [[Combined Moodle Search and AI|Moodle Search/AI Content indexing system]]. That system would use a single selected AI instance for generating embeddings, ensuring that embeddings for indexed content and for user queries that search Moodle AI Contexts are generated consistently.

# AI Consumers

Almost by definition, if we have AI "Providers", which offer up AI services, then there would be some form of "AI Consumer". Typically this would be a Moodle plugin. Activity plugins would typically have instances in particular courses; other plugins may exist in a different context, such as being global, or being more contextual and relying on the context in which they are being viewed.

Rather than leaving developers completely responsible for implementing their own AI integration, Moodle would offer an AI Provider configuration option similar to the OAuth Provider configuration. Programmatically, a developer could display a list of Providers on their plugin's settings page, allowing the server administrator to link that plugin to a particular AI Provider, or they could use an AI Provider configuration setting to select a Provider on a per-instance basis:

![[AddAIConsumer.jpg]]

The scope of the linked AI Provider definition would constrain the information made available to the LLM. That scope would also be forwarded to the Moodle Retrieval Subsystem to constrain which results were eligible to be returned to the AI Provider.
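As a purely illustrative sketch of how this might look to a plugin developer, the PHP below imagines a `\core_ai\provider` contract and a `manager` registry. None of these names exist in Moodle core; they are assumptions made only to show how a Consumer plugin could enumerate the administrator-created Provider instances whose scope covers its context and offer them for selection.

```php
<?php
// Hypothetical sketch only: none of these classes, namespaces or methods exist
// in Moodle core. The names (\core_ai\provider, manager, get_enabled_instances())
// are invented here purely to illustrate how an AI Consumer plugin might
// discover and select an administrator-created AI Provider instance.

namespace core_ai;

/**
 * Contract a provider plugin instance (e.g. OpenAI, Ollama) might implement
 * so that consumers can talk to any configured LLM uniformly.
 */
interface provider {
    /** Human-readable name of this instance, e.g. "Global AI". */
    public function get_name(): string;

    /** Send a prompt to the underlying LLM and return its completion. */
    public function complete(string $prompt): string;

    /** Generate an embedding vector for the retrieval subsystem. */
    public function embed(string $text): array;
}

/**
 * Registry returning the provider instances whose scope covers the calling
 * consumer's Moodle context (site-wide, category, course, ...).
 */
class manager {
    /** @var provider[] */
    private array $instances;

    /** @param provider[] $instances Instances visible in the current context. */
    public function __construct(array $instances) {
        $this->instances = $instances;
    }

    /** @return provider[] keyed by instance name, suitable for a settings drop-down. */
    public function get_enabled_instances(): array {
        $choices = [];
        foreach ($this->instances as $instance) {
            $choices[$instance->get_name()] = $instance;
        }
        return $choices;
    }
}
```

Under these assumptions, a Consumer plugin's settings form could iterate over `get_enabled_instances()` to build the drop-down shown in the mock-up above, storing only the chosen instance's identifier against each activity instance.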