Commit dd032ec

Release version 3.0.1
1 parent 2957d9d commit dd032ec

File tree

3 files changed: +23 −1 lines changed

CHANGELOG.md

Lines changed: 11 additions & 0 deletions
@@ -1,5 +1,16 @@
 # Changelog
 
+## Version 3.0.1
+
+- Fixed bug with `https:` URLs defaulting to port `80` instead of `443` if no port is specified.
+  Thanks to @dskvr for reporting
+
+  This affects comparing URLs with the default HTTPS port to URLs without it.
+  For example, comparing `https://example.com/` to `https://example.com:443/` or vice versa.
+
+  They should be treated as equivalent but weren't due to the incorrect port
+  being used for `https:`.
+
 ## Version 3.0.0
 
 - Changed to using global URL object instead of importing. – Thanks to @brendankenny
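To illustrate the fix described in the changelog entry above, here is a minimal sketch of scheme-aware port defaulting. This is not the library's actual code; the helper names (`effectivePort`, `canonicalize`) and the comparison-by-canonical-key approach are hypothetical. It relies only on the WHATWG URL behavior that a scheme's default port is normalized away, so `https://example.com:443/` parses with an empty `port`.

```javascript
// Default ports per scheme. Defaulting every scheme to '80' (the bug
// fixed in 3.0.1) would give https: URLs the wrong effective port.
const DEFAULT_PORTS = { 'http:': '80', 'https:': '443' };

// Hypothetical helper: fill in the scheme's default when the URL
// carries no explicit port (WHATWG URL elides default ports).
function effectivePort(url) {
  return url.port || DEFAULT_PORTS[url.protocol] || '';
}

// Hypothetical canonical key used to compare two URLs' origins.
function canonicalize(input) {
  const url = new URL(input);
  return `${url.protocol}//${url.hostname}:${effectivePort(url)}`;
}

// With the scheme-aware default, the two forms from the changelog
// example produce the same key and compare as equivalent:
console.log(canonicalize('https://example.com/'));     // https://example.com:443
console.log(canonicalize('https://example.com:443/')); // https://example.com:443
```

Under the pre-3.0.1 behavior, filling the missing port with `80` regardless of scheme would canonicalize a bare `https:` URL to port `80`, so it could fail to match a comparand whose port was resolved correctly to `443`.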

README.md

Lines changed: 11 additions & 0 deletions
@@ -94,6 +94,17 @@ Returns the preferred host name specified by the `host:` directive or null if th
 
 # Changes
 
+### Version 3.0.1
+
+- Fixed bug with `https:` URLs defaulting to port `80` instead of `443` if no port is specified.
+  Thanks to @dskvr for reporting
+
+  This affects comparing URLs with the default HTTPS port to URLs without it.
+  For example, comparing `https://example.com/` to `https://example.com:443/` or vice versa.
+
+  They should be treated as equivalent but weren't due to the incorrect port
+  being used for `https:`.
+
 ### Version 3.0.0
 
 - Changed to using global URL object instead of importing. – Thanks to @brendankenny

package.json

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 {
   "name": "robots-parser",
-  "version": "3.0.0",
+  "version": "3.0.1",
   "description": "A specification compliant robots.txt parser with wildcard (*) matching support.",
   "keywords": [
     "robots.txt",
