
Private Generative AI on In-house Infrastructure: Greenative Now Supports Gemma 3 and MCP Integration

Greenative, a tool for building, operating, and managing open generative AI models, now supports Gemma 3.

In addition, by integrating with "greenative.mcp", a component compliant with the Model Context Protocol (MCP), Greenative can connect to various databases and storage systems on the company's own infrastructure, so the entire workflow from generative AI inference to data processing runs in a private environment.

Greenative : Gemma 3 and MCP integration


Characteristics of Greenative and Gemma 3 integration


Greenative combines open generative AI models such as Gemma 3 with MCP, a standardized protocol for data access, to link your own data with generative AI in a private environment.


  • Queries to the generative AI and the resulting data access are completed entirely within the company's own infrastructure (including public cloud), so highly confidential data can be handled safely.

  • The pre-configured greenative.mcp component provides access to various databases and storage systems through predefined data sources (see here for details).

  • In response to a natural-language query from a user, Gemma 3 generates the corresponding SQL, and Greenative routes the request to the appropriate MCP data source.

  • By leveraging Gemma 3, a high-performance generative AI model, advanced natural language processing can be achieved with minimal computing resources (the demo runs on an Apple M4 chip) rather than a large GPU cluster.

  • Greenative is a browser-based tool, not a desktop application, so multiple users can share the environment.
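The query flow described above can be sketched as follows. MCP is built on JSON-RPC 2.0, and tool invocations use the `tools/call` method; the sketch below builds such a request in Python. The tool name `run_sql`, its `query` argument, and the example SQL are illustrative assumptions, since the actual tool names and argument schemas depend on how each greenative.mcp data source is defined.

```python
import json

def build_mcp_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 request as used by MCP's tools/call method."""
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(request)

# SQL such as Gemma 3 might generate from a question like
# "show total sales by region" (hypothetical table and columns).
sql = "SELECT region, SUM(amount) FROM sales GROUP BY region"

# The client (here, Greenative's role) wraps the generated SQL in an
# MCP tool call and sends it to the server exposing the data source.
payload = build_mcp_tool_call(1, "run_sql", {"query": sql})
print(payload)
```

The model never touches the database directly: it only emits SQL text, and the MCP server that executes the tool call is the sole component with database credentials, which is what keeps the whole flow inside the private environment.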


This video walks through data acquisition and aggregation for natural-language queries in Greenative using Gemma 3 and MCP.


Use Cases


Flexible data utilization in mixed cloud / on-premise environments

Public clouds such as AWS and Google Cloud, together with on-premise databases and storage, can be connected and accessed centrally from Greenative. Internal and external data can be searched and reported on in natural language via generative AI, enabling flexible data utilization free of location and system constraints.


Utilization of sensitive data

Data in the public cloud and information in mission-critical systems can be linked with generative AI while using the existing infrastructure, making it easy to incorporate data analysis and automation into business operations.


Improving operational efficiency of local governments and public institutions

Administrative data stored in a local government's own cloud or internal servers can be put to use with Greenative and Gemma 3, streamlining daily operations such as searching and aggregating highly confidential data.


Search and summarize knowledge base and FAQ

Internal documents, knowledge bases, and FAQs can be searched and summarized by AI in natural language. This helps improve the efficiency and quality of customer support and the internal helpdesk.


Rapid prototyping in the data analysis department

New analysis and application ideas can be tested quickly with generative AI against data in the cloud or on-premise. Since Greenative runs in local environments such as machines with Apple M4 chips, it is easy to deploy for PoCs and small-scale projects.



Deploying Greenative


Greenative can be deployed in the SaaS environment operated by Avidea, in a public cloud environment, or on on-premise infrastructure owned by a company or organization. It can be customized to each organization's security policies and operational requirements, and flexibly integrated with internal systems.
