
How to manage generative AI

Mar 11, 2024, 12:04 PM


Author | Dom Couldwell

Compiled | Noah

Produced | 51CTO Technology Stack (WeChat ID: blog51cto)

According to McKinsey & Company estimates, generative artificial intelligence is expected to bring economic benefits of US$2.6 trillion to US$4.4 trillion to the global economy annually. This forecast is based on 63 new application scenarios that are expected to bring improvements, efficiency gains and new products to customers in multiple markets. This is undoubtedly a huge opportunity for developers and IT leaders.

The core of generative AI lies in data. Data not only gives generative AI the ability to understand and analyze the world around it, but also powers its transformative potential. To succeed in the field of generative AI, companies need to effectively manage and prepare data.

To successfully build and operate large-scale AI services and support generative AI projects, you need to do your homework on data preparation and adopt a smart, sustainable funding strategy. A slow pace and waning support will not produce an advantage in artificial intelligence. Beyond scaling AI services, you also need to secure stable funding sources for projects to promote long-term development and continuous innovation.

The huge potential of generative AI will be wasted if we don't improve how we manage data or take the right approach to scale and cost control. Here are some thoughts on how to improve data management and support generative AI projects over the long term.

1. Where does the data come from?

Data exists in many forms. Used properly, each form of data can enhance the richness and quality of generative AI insights.

The first form is structured data, which is organized in a regularly ordered and consistent manner and includes items such as product information, customer demographics, or inventory levels. This type of data provides an organized fact base that can be added to generative AI projects to improve the quality of responses.

In addition, you may have external data sources that complement your internal structured data, such as weather reports, stock prices, or traffic flows. This data brings real-time, real-world context to the decision-making process; integrating it into a project provides additional high-quality data without your having to generate it yourself.

Another common data set is derived data, which covers data created through analysis and modeling scenarios. Such insights might include customer intent reports, seasonal sales forecasts, or segment analysis.

The last common form of data is unstructured data, which is different from the regular report or data formats analysts are used to. This type of data includes formats such as images, documents, and audio files. These data capture the nuances of human communication and expression. Generative AI programs often work around images or audio, which are common inputs and outputs for generative AI models.

2. Achieving large-scale application of generative AI

All these diverse data sets exist in their own environments. To make them useful for generative AI projects, the key is to make this diverse data landscape accessible in real time. With so much potential data involved, any approach must scale dynamically as demand grows and replicate data globally, keeping resources close to users when requested, thus avoiding downtime and reducing latency on transaction requests.

In addition, this data also needs to be preprocessed so that it can be effectively utilized by generative AI systems. This involves creating embeddings, which are mathematical values, or vectors, that represent semantic meaning. Embedding enables generative AI systems to go beyond specific text matching and instead embrace the meaning and context embedded in the data. Regardless of the original data form, creating embeddings means the data can be understood and used by generative AI systems while retaining its meaning and context.
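The embedding step described above can be sketched in a few lines. The function below is a toy hashing-based vectorizer, not a real semantic embedding model (in practice you would call an embedding model or API); it only illustrates the shape of the transformation: arbitrary text in, fixed-length numeric vector out.

```python
import hashlib
import math

def toy_embed(text: str, dims: int = 64) -> list[float]:
    """Map text to a fixed-length vector.

    Toy bag-of-words hashing trick for illustration only -- a real
    embedding model would capture semantic meaning, not token counts.
    """
    vec = [0.0] * dims
    for token in text.lower().split():
        # Hash each token to a stable bucket index.
        bucket = int(hashlib.md5(token.encode()).hexdigest(), 16) % dims
        vec[bucket] += 1.0
    # Normalize to unit length so vectors are comparable by cosine similarity.
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

v = toy_embed("inventory levels for winter jackets")
print(len(v))  # same fixed dimensionality regardless of input length
```

Whatever the model, the key property is the same: every piece of data, regardless of its original form, ends up as a vector of the same dimensionality, which is what makes cross-source comparison possible.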

With these embeddings, enterprises can support vector or hybrid searches across all their data, combining value and meaning. The results are then collected and passed to a large language model (LLM), which integrates them. By providing more data from multiple sources, rather than relying solely on the LLM itself, your generative AI projects can give users more accurate results and reduce the risk of hallucinated content.

To achieve this in practice, the right underlying data architecture must be chosen. Avoid dispersing data across different solutions into a fragmented patchwork, because each such solution is an island of data that must be supported, queried, and managed over the long term. Users should be able to ask the LLM questions and get responses quickly, rather than waiting for multiple components to respond and have their results weighed by the model. A unified data architecture should provide seamless data integration, allowing generative AI to fully utilize the entire available data spectrum.

3. Advantages of a modular approach

To scale generative AI implementations, there needs to be a balance between accelerating adoption and maintaining control of critical assets. Taking a modular approach to building generative AI agents can make this process easier because it breaks down the implementation process and avoids potential bottlenecks.

Similar to microservices design in applications, a modular approach to AI services encourages best practices in application and software design, eliminates single points of failure, and gives more potential users access to the technology. It also makes it easier to monitor the performance of AI agents across the enterprise and to pinpoint more precisely where problems occur.

The first benefit of modularity is interpretability: because the components of a generative AI system are separated from each other, it is easier to analyze how an agent operates and makes decisions. AI is often viewed as a "black box," and modularity makes results easier to track and interpret.

The second benefit is security, as individual components can be protected with optimal authentication and authorization mechanisms, ensuring that only authorized users can access sensitive data and functionality. Modularity also makes compliance and governance easier, as personally identifiable information (PII) or intellectual property (IP) can be secured and kept separate from the underlying LLM.

4. Provide a continuously flexible funding model

In addition to adopting a microservices approach, a platform mindset should be adopted in overall generative AI projects. This means replacing the traditional project-based software project funding model with one that provides an ongoing and flexible funding model. This approach empowers participants to make value-based decisions, respond to emerging opportunities, and develop best practices without being constrained by rigid funding cycles or business cases.

Managing budgets in this way also encourages developers and business teams to treat generative AI as part of the organization's existing infrastructure. It becomes easier to smooth out the spikes and troughs in planned workloads, to take a "center of excellence" approach, and to maintain consistency over the long term.

A similar approach is to treat generative AI as a product operated by the enterprise itself, rather than as pure software. AI agents should be managed as products, as this more effectively reflects the value they create and makes support resources for integrations, tools, and prompts more readily available. This model helps spread understanding of generative AI across the organization, promotes the adoption of best practices, and creates a culture of shared expertise and collaboration in generative AI development.

Generative AI has huge potential, and companies are racing to implement new tools, agents, and prompts in their operations. However, moving these projects into production requires effective data management, a foundation for scaling the system, and a budget model that supports the team. Getting your processes and priorities right will help you and your team unlock the transformative potential of this technology.

Reference address: https://www.infoworld.com/article/3713461/how-to-manage-generative-ai.html

