
Data flows are divided into several categories

Jun 29, 2020, 04:17 PM

Data streams are divided into two categories: 1. the input stream (InputStream), which can only be read, not written; 2. the output stream (OutputStream), which can only be written, not read. A program typically uses input streams to read data and output streams to write data, just as data flows into and out of the program.


A data stream is an ordered sequence of bytes with a starting point and an ending point. It includes input streams and output streams.

Data stream was originally a concept from the field of communications, referring to a sequence of digitally encoded signals used to transmit information. The concept was first proposed by Henzinger in 1998 [87], who defined a data stream as "a sequence of data that can only be read once, in a predetermined order."

The development of data flow applications is the result of the following two factors:

Detailed data

It has become possible to generate large amounts of detailed data continuously and automatically. This type of data first appeared in traditional banking and stock trading, and later in geological surveying, meteorology, astronomical observation, and other fields. In particular, the Internet (network traffic monitoring, clickstreams) and wireless communication networks (call records) generate large amounts of data-stream-style data. Notably, much of this data is related to geographic information, mainly because geographic data is high-dimensional and therefore readily produces such large volumes of detailed records.

Complex Analysis

Update streams need to be analyzed in complex ways in near real time. Complex analysis of data in the above fields (such as trend analysis and prediction) used to be performed offline, in data warehouses. However, some new applications (especially in network security and national security) are highly time-sensitive: detecting extreme events, fraud, intrusions, and anomalies on the Internet, complex crowd monitoring, trend tracking, exploratory analysis, harmonic analysis, and so on all require online analysis.

Afterwards, the academic community largely adopted this definition, and some papers modified it slightly. For example, S. Guha et al. [88] regard a data stream as "an ordered sequence of points that can only be read once or a small number of times", relaxing the "one pass" restriction in the earlier definition.

Why is the limit on the number of reads emphasized in data stream processing? S. Muthukrishnan [89] pointed out that a data stream is "input data arriving at a very high rate", which makes transmitting, computing on, and storing the data difficult. Under these conditions there is only one opportunity to process each item, when it first arrives; it is hard to access it again later, because the data is not stored and there is no practical way to store it.
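To make the one-pass constraint concrete, here is a minimal sketch that computes a running mean over values arriving on a stream, touching each value exactly once and keeping only a small summary in memory. The class name and the simulated input values are illustrative, not something defined in this article.

```java
import java.util.PrimitiveIterator;
import java.util.stream.DoubleStream;

// Illustrative sketch: a one-pass computation over a data stream.
// Each value is consumed exactly once as it arrives and is never revisited.
public class OnePassMean {
    public static void main(String[] args) {
        // Simulated stream of arriving measurements (in practice: clickstream
        // events, call records, sensor readings, etc.).
        PrimitiveIterator.OfDouble arriving =
                DoubleStream.of(3.2, 7.1, 4.8, 9.0, 5.5).iterator();

        long count = 0;
        double mean = 0.0;
        while (arriving.hasNext()) {
            double x = arriving.nextDouble();   // read once ...
            count++;
            mean += (x - mean) / count;         // ... update the summary, then discard
        }
        System.out.println("Mean after a single pass: " + mean);
    }
}
```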

Classification:

Data differ in nature and format, so the corresponding streams are processed in different ways. The Java input/output class library therefore provides different stream classes for input/output streams of different natures. In the java.io package, the basic input/output stream classes can be divided into two kinds according to the type of data they read and write: byte streams and character streams.
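As a minimal illustration of these two families (assuming hypothetical files data.bin and data.txt exist), a byte stream reads raw bytes while a character stream reads text characters:

```java
import java.io.FileInputStream;
import java.io.FileReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.Reader;

// Sketch of the two basic families in java.io:
// byte streams (InputStream/OutputStream) and character streams (Reader/Writer).
public class ByteVsCharStream {
    public static void main(String[] args) throws IOException {
        // Byte stream: reads raw bytes, suitable for any kind of data (images, audio, ...).
        try (InputStream in = new FileInputStream("data.bin")) {
            int b = in.read();        // one byte (0-255), or -1 at end of stream
            System.out.println("First byte: " + b);
        }

        // Character stream: reads characters, suitable for text.
        try (Reader reader = new FileReader("data.txt")) {
            int c = reader.read();    // one character, or -1 at end of stream
            System.out.println("First char: " + (char) c);
        }
    }
}
```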

Input stream and output stream

Data streams are divided into two categories: the input stream (InputStream) and the output stream (OutputStream). An input stream can only be read, not written, while an output stream can only be written, not read. Programs typically use input streams to read data and output streams to write data, just as data flows into and out of the program. Data streams make a program's input and output operations independent of the underlying devices.

The input stream can obtain data from the keyboard or a file, and the output stream can transmit data to the monitor, printer, or file.
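Here is a small sketch of this flow, copying bytes from one file into another through an input stream and an output stream; the file names source.dat and target.dat are placeholders:

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Sketch: data flows into the program through an InputStream and out through an
// OutputStream.
public class CopyStream {
    public static void main(String[] args) throws IOException {
        try (InputStream in = new FileInputStream("source.dat");
             OutputStream out = new FileOutputStream("target.dat")) {
            int b;
            while ((b = in.read()) != -1) {   // read one byte from the input stream
                out.write(b);                 // write it to the output stream
            }
        }
    }
}
```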

Buffered Stream

To improve the efficiency of data transfer, a buffered stream is usually used: the stream is equipped with a buffer, a block of memory dedicated to transferring data. When data is written to a buffered stream, the system does not send it directly to the external device; instead, it is placed in the buffer. When the buffer is full, the system sends all of its data to the corresponding device at once.

When data is read from a buffered stream, the system actually reads it from the buffer. When the buffer is empty, the system automatically reads data from the underlying device, reading as much as possible to fill the buffer.
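The same file copy can be wrapped in buffered streams. This sketch (again with placeholder file names) shows reads and writes going through the in-memory buffer rather than hitting the device one byte at a time:

```java
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

// Sketch: the buffered wrappers batch device access through an internal buffer.
public class BufferedCopy {
    public static void main(String[] args) throws IOException {
        try (BufferedInputStream in =
                     new BufferedInputStream(new FileInputStream("source.dat"));
             BufferedOutputStream out =
                     new BufferedOutputStream(new FileOutputStream("target.dat"))) {
            int b;
            while ((b = in.read()) != -1) {
                out.write(b);        // goes into the buffer, not straight to the device
            }
            out.flush();             // push any remaining buffered bytes to the device
        }
    }
}
```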

The above is the detailed content of Data flows are divided into several categories. For more information, please follow other related articles on the PHP Chinese website!

