How to encode and decode Chinese characters in PHP URLs
PHP URL Chinese encoding method: 1. Use the "urlencode" function to encode the string; the syntax is "urlencode(string)". 2. Use the "urldecode" function to decode it again; the syntax is "urldecode(string)".
Encoding and decoding Chinese characters in PHP URLs
Some beginners may be a little unfamiliar with the concept of URL encoding and decoding. Put another way: when browsing the web, you may notice that some URLs contain special symbols such as #, &, _ or Chinese characters. To comply with the URL specification, URLs containing such characters need to be encoded. With this simple explanation, you should now have a rough idea of what URL encoding and decoding is about.
URL encoding, also called percent-encoding, is the encoding mechanism used in Uniform Resource Locators (URLs).
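Before looking at the PHP functions, here is a minimal sketch of the problem itself (this example is not from the original article; the script name search.php and the parameter keyword are made up for illustration): a value that contains an "&" is cut off at the ampersand unless it is encoded first.

<?php
// Hypothetical example: "search.php" and "keyword" are made-up names.
$keyword = 'php & mysql';

// Unencoded: the "&" is treated as a parameter separator, so the value is truncated.
echo "search.php?keyword=$keyword" . PHP_EOL;                  // search.php?keyword=php & mysql

// Encoded: the whole value survives as a single parameter.
echo 'search.php?keyword=' . urlencode($keyword) . PHP_EOL;    // search.php?keyword=php+%26+mysql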
Below, we use simple code examples to show how PHP encodes and decodes Chinese characters in URLs.
1. URL encoding (urlencode)
<?php
$url = urlencode('PHP中文网');      // encode the string "PHP中文网"
$password = md5(123123);            // hash the password 123123 with md5
echo "index.php?user=$url&password=$password";
In this code, we use the urlencode function to encode the string "PHP中文网", hash the password with md5, and finally output the assembled URL for testing.
The result looks like this:

index.php?user=PHP%E4%B8%AD%E6%96%87%E7%BD%91&password=<32-character md5 hash>

The three Chinese characters "中文网" have been encoded into a combination of percent signs, hexadecimal digits and letters (the ASCII letters "PHP" are left unchanged), and the original password "123123" has been replaced by its md5 hash.
Note: urlencode encodes a string so that it can be used in a URL.
Its return value is a string in which all non-alphanumeric characters except -, _ and . are replaced with a percent sign (%) followed by two hexadecimal digits, and spaces are encoded as plus signs (+).
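As a side note not covered in the original article, PHP also provides rawurlencode, which follows RFC 3986 and encodes a space as %20 instead of +. The short sketch below (the sample string is made up) shows the difference; Chinese characters are percent-encoded the same way by both functions.

<?php
$text = 'PHP 中文网';   // sample string containing a space and Chinese characters

echo urlencode($text) . PHP_EOL;      // PHP+%E4%B8%AD%E6%96%87%E7%BD%91    (space => "+")
echo rawurlencode($text) . PHP_EOL;   // PHP%20%E4%B8%AD%E6%96%87%E7%BD%91  (space => "%20", RFC 3986)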
2. URL decoding (urldecode)
<?php
$url = urlencode('PHP中文网');      // encode the string "PHP中文网"
$password = md5(123123);
echo "index.php?user=$url&password=$password" . '<br>';

$url = urldecode("%E4%B8%AD%E6%96%87%E7%BD%91");   // decode the percent-encoded bytes back into "中文网"
echo $url;
Next we decode the encoded Chinese characters back again, mainly using PHP's urldecode function.
The test output is as follows: the first line is the encoded URL from the previous example, and the second line shows the percent-encoded sequence decoded back into the Chinese characters "中文网".
Note: urldecode decodes a percent-encoded URL string.
Its return value is the decoded string.
For URL encoding and decoding, the two functions you mainly need to master are urlencode and urldecode.
URL encoding is mainly used to comply with the URL specification, so that special characters and non-ASCII text can be passed safely as part of a URL.
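When a URL carries several parameters, building the query string by hand quickly becomes error-prone. As a sketch of an alternative approach that this article does not cover (the parameter names are taken from the example above), http_build_query applies the same urlencode-style encoding to every value automatically, and parse_str turns a query string back into an array:

<?php
$params = [
    'user'     => 'PHP中文网',
    'password' => md5(123123),
];

// Every value is percent-encoded automatically.
$query = http_build_query($params);
echo "index.php?$query" . PHP_EOL;
// index.php?user=PHP%E4%B8%AD%E6%96%87%E7%BD%91&password=...

// Decode the query string back into an associative array.
parse_str($query, $decoded);
echo $decoded['user'] . PHP_EOL;   // PHP中文网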
For more related knowledge, please visit PHP Chinese website!
