
Regular Expressions for Validating URLs in PHP

Jun 08, 2016 05:26 PM
Tags: filter, https, url


// 1. https:// prefix, then anything ending in a 2-4 letter top-level domain, no path
'/^https:\/\/.+\.[a-z]{2,4}$/';

// 2. As above, but a trailing slash and a path are required
'/^https:\/\/.+\.[a-z]{2,4}\/.*$/';

// 3. Accept both http and https; a path is still required
'/^https?:\/\/.+\.[a-z]{2,4}\/.*$/';

// 4. http or https, the path is optional, and matching is case-insensitive
'/^https?:\/\/.+\.[a-z]{2,4}(\/.*|)$/i';

// 5. Match either a domain name or a dotted IPv4 address after the scheme
'/^http(s)?:\/\/([a-zA-Z0-9.]?([0-9a-zA-Z][0-9a-zA-Z-]+\.)+[a-zA-Z]{2,5})|((25[0-5]|2[0-4][0-9]|[0-1]{1}[0-9]{2}|[1-9]{1}[0-9]{1}|[1-9])\.(25[0-5]|2[0-4][0-9]|[0-1]{1}[0-9]{2}|[1-9]{1}[0-9]{1}|[1-9]|0)\.(25[0-5]|2[0-4][0-9]|[0-1]{1}[0-9]{2}|[1-9]{1}[0-9]{1}|[1-9]|0)\.(25[0-5]|2[0-4][0-9]|[0-1]{1}[0-9]{2}|[1-9]{1}[0-9]{1}|[0-9]))/';
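The sketch below is only an illustration of how these patterns might be applied; the sample URLs in $urls are made up for the test. It feeds a few strings through preg_match() with pattern 4 from above and reports which ones pass.

<?php
// Illustrative check of a few sample URLs against pattern 4 above.
$pattern = '/^https?:\/\/.+\.[a-z]{2,4}(\/.*|)$/i';

$urls = [
    'http://www.php.cn',
    'https://example.com/path/to/page?id=1',
    'not a url',
];

foreach ($urls as $url) {
    if (preg_match($pattern, $url)) {
        echo $url . " looks like a valid URL\n";
    } else {
        echo $url . " is rejected\n";
    }
}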

// 6. PHP's built-in validation filter (no regular expression needed)
filter_var($url, FILTER_VALIDATE_URL);
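A minimal sketch of the built-in approach, assuming a sample $url value chosen purely for illustration: filter_var() returns the URL itself when it is syntactically valid and false otherwise, so compare the result against false.

<?php
// Illustrative use of FILTER_VALIDATE_URL; the $url value is a made-up sample.
$url = 'https://www.php.cn/index.html';

if (filter_var($url, FILTER_VALIDATE_URL) !== false) {
    echo "Valid URL\n";
} else {
    echo "Invalid URL\n";
}

Note that FILTER_VALIDATE_URL only checks the syntax of the URL; it does not verify that the host exists or is reachable.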

 
