diff --git a/CHANGELOG.md b/CHANGELOG.md
index c1ddb44256..98cec04db5 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,5 +1,45 @@
# 更新日志(Changelog)
+## v1.5.1
+
+### 2024/11/5
+
+- ✨ 新增频道接口白名单:不参与测速,永远保留在结果最前面(#470)
+ 使用方法:
+ 1. 模板频道接口地址后添加$!即可实现(如:广东珠江,http://xxx.m3u$! )
+ 2. 额外信息补充(如:广东珠江,http://xxx.m3u$!额外信息 ),更多接口白名单请至https://github.com/Guovin/TV/issues/514 讨论
+- ✨ 新增 🈳 无结果频道分类:无结果频道默认归类至该底部分类下(#473)
+- ✨ 接口地址增加来源类型说明
+- ✨ 默认模板增加广东民生(#481)、广州综合(#504)
+- 🪄 优化偏好结果输出
+- 🪄 重构配置读取与增加全局常量
+- 🐛 修复部分接口匹配失败问题
+- 🐛 修复更新结果为空等问题(#464,#467)
+- 🐛 修复接口地址复制空格问题(#472 by @haohaitao)
+- 🐛 修复结果日志 unpack error
+- 🐛 修复结果接口信息为空问题(#505)
+- 🗑️ 移除仓库根目录 txt 结果文件,请至 output 目录下查看结果文件
+
+
+ English
+
+- ✨ Added channel interface whitelist: whitelisted interfaces skip speed testing and are always kept at the very front of the results (#470).
+ Usage:
+ 1. Add $! after the template channel interface address (e.g., Guangdong Pearl River, http://xxx.m3u$!).
+ 2. Additional information can be appended after the marker (e.g., Guangdong Pearl River, http://xxx.m3u$!additional info). For more interface whitelist requests, please discuss at https://github.com/Guovin/TV/issues/514.
+- ✨ Added 🈳 No Results Channel Category: Channels without results are categorized under this bottom category by default (#473).
+- ✨ Interface addresses now include source type descriptions.
+- ✨ Default templates now include Guangdong People's Livelihood (#481) and Guangzhou Comprehensive (#504).
+- 🪄 Optimized preferred result output.
+- 🪄 Refactored configuration reading and added global constants.
+- 🐛 Fixed issues with partial interface matching failures.
+- 🐛 Fixed problems with empty update results, etc. (#464, #467).
+- 🐛 Fixed the issue of spaces being copied with the interface address (#472, by @haohaitao).
+- 🐛 Fixed the unpack error in result logs.
+- 🐛 Fixed the issue of empty interface information in results (#505).
+- 🗑️ Removed txt result files from the repository root directory. Please check the result files in the output directory.
+
+
## v1.5.0
### 2024/10/25
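The `$!` whitelist marker introduced in this release is handled by splitting on `$` and `!`: the marker itself is dropped from the output URL, while any extra info appended after `!` is preserved. A minimal sketch of that parsing (hypothetical function name, mirroring the `important`-origin branch in utils/tools.py):

```python
def strip_whitelist_marker(url: str) -> str:
    # Template entries like "http://xxx.m3u$!extra" mark a whitelisted
    # interface: "$" separates the URL from its info, and "!" flags the
    # whitelist. The marker is removed; info after "!" is kept.
    pure_url, _, info = url.partition("$")
    extra = info.partition("!")[2]
    return f"{pure_url}${extra}" if extra else pure_url

print(strip_whitelist_marker("http://xxx.m3u$!"))       # http://xxx.m3u
print(strip_whitelist_marker("http://xxx.m3u$!extra"))  # http://xxx.m3u$extra
```

Entries without any `$` marker pass through unchanged, since `str.partition` returns empty remainder strings when the separator is absent.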
diff --git a/README.md b/README.md
index 5d3d762d45..231e589a46 100644
--- a/README.md
+++ b/README.md
@@ -21,7 +21,7 @@
- 🏠广东频道: 广东珠江, 广东体育, 广东新闻, 广东卫视, 大湾区卫视, 广州影视, 广州竞赛, 江门综合, 江门侨乡生活, 佛山综合, 深圳卫视, 汕头综合, 汕头经济, 汕头文旅, 茂名综合, 茂名公共
+ 🏠广东频道: 广东珠江, 广东体育, 广东新闻, 广东民生, 广东卫视, 大湾区卫视, 广州综合, 广州影视, 广州竞赛, 江门综合, 江门侨乡生活, 佛山综合, 深圳卫视, 汕头综合, 汕头经济, 汕头文旅, 茂名综合, 茂名公共
- 🏠Guangdong Channel: 广东珠江, 广东体育, 广东新闻, 广东卫视, 大湾区卫视, 广州影视, 广州竞赛, 江门综合, 江门侨乡生活, 佛山综合, 深圳卫视, 汕头综合, 汕头经济, 汕头文旅, 茂名综合, 茂名公共
+ 🏠Guangdong Channel: 广东珠江, 广东体育, 广东新闻, 广东民生, 广东卫视, 大湾区卫视, 广州综合, 广州影视, 广州竞赛, 江门综合, 江门侨乡生活, 佛山综合, 深圳卫视, 汕头综合, 汕头经济, 汕头文旅, 茂名综合, 茂名公共
@@ -70,23 +70,27 @@
-
+
+
+
+
[中文](./README.md) | English
-## Features
+## ✅ Features
-- Customize the template to generate the channel you want
-- Supports multiple source acquisition methods: multicast source, hotel source, subscription source, keyword search
-- Interface speed testing and verification, with priority on response time and resolution, filtering out ineffective interfaces
-- Scheduled execution at 6:00 AM and 18:00 PM Beijing time daily
-- Supports various execution methods: workflows, command line, GUI software, Docker(amd64/arm64)
-- For more features, see [Config parameter](./docs/config_en.md)
+- ✅ Customize the template to generate the channel you want
+- ✅ Supports multiple source acquisition methods: multicast source, hotel source, subscription source, keyword search
+- ✅ Interface speed testing and verification, with priority on response time and resolution, filtering out ineffective interfaces
+- ✅ Preferences: IPv6, priority and quantity of interface source sorting, and interface whitelist
+- ✅ Scheduled execution at 6:00 AM and 6:00 PM Beijing time daily
+- ✅ Supports various execution methods: workflows, command line, GUI software, Docker (amd64/arm64)
+- ✨ For more features, see [Config parameter](./docs/config_en.md)
-## Latest results:
+## ✨ Latest results
- Interface source:
@@ -104,11 +108,11 @@ https://ghproxy.net/raw.githubusercontent.com/Guovin/TV/gd/output/result.txt
https://ghproxy.net/raw.githubusercontent.com/Guovin/TV/gd/source.json
```
-## Config
+## ⚙️ Config
[Config parameter](./docs/config_en.md)
-## Quick Start
+## 🪄 Quick Start
### Method 1: Workflow
@@ -117,8 +121,14 @@ Fork this project and initiate workflow updates, detailed steps are available at
### Method 2: Command Line
```python
-pip3 install pipenv
+pip install pipenv
+```
+
+```python
pipenv install
+```
+
+```python
pipenv run build
```
@@ -168,15 +178,15 @@ For example: docker run -v /etc/docker/config:/tv-requests/config -v /etc/docker
#### Note: Link to the result file after updates of methods one to three: http://local ip:8000 or http://localhost:8000
-## Changelog
+## 🗓️ Changelog
[Changelog](./CHANGELOG.md)
-## License
+## 📄 License
[MIT](./LICENSE) License © 2024-PRESENT [Govin](https://github.com/guovin)
-## Appreciate
+## 💰️ Appreciate
Please buy me a cup of coffee☕️~
@@ -184,6 +194,6 @@ For example: docker run -v /etc/docker/config:/tv-requests/config -v /etc/docker
| ------------------------------------- | ----------------------------------------- |
| ![Alipay](./static/images/alipay.jpg) | ![Wechat](./static/images/appreciate.jpg) |
-## Disclaimer
+## 📣 Disclaimer
This project is for learning and communication purposes only. All interface data comes from the internet. If there is any infringement, please contact us for removal.
diff --git a/docs/tutorial.md b/docs/tutorial.md
index d4d229e046..8b7d24d736 100644
--- a/docs/tutorial.md
+++ b/docs/tutorial.md
@@ -174,7 +174,7 @@ https://mirror.ghproxy.com/raw.githubusercontent.com/您的github用户名/仓
2. 运行更新
项目目录下打开终端 CMD 依次运行以下命令:
-pip3 install pipenv
+pip install pipenv
pipenv install
pipenv run build
```
diff --git a/docs/tutorial_en.md b/docs/tutorial_en.md
index 154b3aa0f8..24f29df6ae 100644
--- a/docs/tutorial_en.md
+++ b/docs/tutorial_en.md
@@ -171,7 +171,7 @@ Please download and install Python from the official site. During installation,
2. Run Update
Open a CMD terminal in the project directory and run the following commands in sequence:
-pip3 install pipenv
+pip install pipenv
pipenv install
pipenv run build
```
diff --git a/updates/fofa/fofa_hotel_region_result.pkl b/updates/fofa/fofa_hotel_region_result.pkl
new file mode 100644
index 0000000000..6954a64d64
Binary files /dev/null and b/updates/fofa/fofa_hotel_region_result.pkl differ
diff --git a/updates/fofa/request.py b/updates/fofa/request.py
index ec8ea99b56..b823c37107 100644
--- a/updates/fofa/request.py
+++ b/updates/fofa/request.py
@@ -6,6 +6,7 @@
from driver.setup import setup_driver
import re
from utils.config import config
+import utils.constants as constants
from utils.retry import retry_func
from utils.channel import format_channel_name
from utils.tools import merge_objects, get_pbar_remaining, add_url_info, resource_path
@@ -91,9 +92,10 @@ async def get_channels_by_fofa(urls=None, multicast=False, callback=None):
test_url = fofa_urls[0][0]
proxy = await get_proxy(test_url, best=True, with_test=True)
cancel_event = threading.Event()
+ hotel_name = constants.origin_map["hotel"]
def process_fofa_channels(fofa_info):
- nonlocal proxy, fofa_urls_len, open_driver, open_sort, cancel_event
+ nonlocal proxy
if cancel_event.is_set():
return {}
fofa_url = fofa_info[0]
@@ -116,7 +118,7 @@ def process_fofa_channels(fofa_info):
page_source = retry_func(
lambda: get_source_requests(fofa_url), name=fofa_url
)
- if "资源访问每天限制" in page_source:
+ if "禁止访问" in page_source or "资源访问每天限制" in page_source:
cancel_event.set()
raise ValueError("Limited access to fofa page")
fofa_source = re.sub(r"", "", page_source, flags=re.DOTALL)
@@ -130,7 +132,11 @@ def process_fofa_channels(fofa_info):
with ThreadPoolExecutor(max_workers=100) as executor:
futures = [
executor.submit(
- process_fofa_json_url, url, fofa_info[1], open_sort
+ process_fofa_json_url,
+ url,
+ fofa_info[1],
+ open_sort,
+ hotel_name,
)
for url in urls
]
@@ -184,7 +190,7 @@ def process_fofa_channels(fofa_info):
return fofa_results
-def process_fofa_json_url(url, region, open_sort):
+def process_fofa_json_url(url, region, open_sort, hotel_name="酒店源"):
"""
Process the FOFA json url
"""
@@ -208,11 +214,11 @@ def process_fofa_json_url(url, region, open_sort):
total_url = (
add_url_info(
f"{url}{item_url}",
- f"{region}酒店源|cache:{url}",
+ f"{region}{hotel_name}|cache:{url}",
)
if open_sort
else add_url_info(
- f"{url}{item_url}", f"{region}酒店源"
+ f"{url}{item_url}", f"{region}{hotel_name}"
)
)
if item_name not in channels:
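The new early-exit check in `process_fofa_channels` relies on a shared `threading.Event`: the first worker that sees a blocked or rate-limited page sets the event, and every remaining worker returns early instead of continuing to hit the site. A self-contained sketch of the pattern (simplified names; the real code runs the workers inside a `ThreadPoolExecutor`):

```python
import threading

cancel_event = threading.Event()

def process(page_source: str) -> dict:
    # Mirrors process_fofa_channels: skip work once cancellation is
    # flagged, and flag it when the page signals blocked access.
    if cancel_event.is_set():
        return {}
    if "禁止访问" in page_source or "资源访问每天限制" in page_source:
        cancel_event.set()
        raise ValueError("Limited access to fofa page")
    return {"ok": True}

results = []
for source in ["normal page", "资源访问每天限制", "normal page"]:
    try:
        results.append(process(source))
    except ValueError:
        results.append(None)

print(results)  # [{'ok': True}, None, {}]
```

Because `threading.Event` is safe to set and check across threads, the same logic works unchanged when the calls run concurrently.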
diff --git a/updates/hotel/request.py b/updates/hotel/request.py
index 3bb67a35f3..d5dbd0b56c 100644
--- a/updates/hotel/request.py
+++ b/updates/hotel/request.py
@@ -41,7 +41,7 @@ async def get_channels_by_hotel(callback=None):
start_time = time()
def process_region_by_hotel(region):
- nonlocal proxy, open_driver, page_num
+ nonlocal proxy
name = f"{region}"
info_list = []
driver = None
diff --git a/updates/multicast/request.py b/updates/multicast/request.py
index c6329a61e1..b147e77c54 100644
--- a/updates/multicast/request.py
+++ b/updates/multicast/request.py
@@ -53,7 +53,7 @@ async def get_channels_by_multicast(names, callback=None):
merge_objects(search_region_type_result, fofa_result)
def process_channel_by_multicast(region, type):
- nonlocal proxy, open_driver, page_num, start_time
+ nonlocal proxy
name = f"{region}{type}"
info_list = []
driver = None
diff --git a/updates/online_search/request.py b/updates/online_search/request.py
index 97630ce340..8450d941af 100644
--- a/updates/online_search/request.py
+++ b/updates/online_search/request.py
@@ -1,5 +1,6 @@
from asyncio import create_task, gather
from utils.config import config
+import utils.constants as constants
from utils.speed import get_speed
from utils.channel import (
format_channel_name,
@@ -11,6 +12,7 @@
get_pbar_remaining,
get_soup,
format_url_with_cache,
+ add_url_info,
)
from updates.proxy import get_proxy, get_proxy_next
from time import time
@@ -61,9 +63,10 @@ async def get_channels_by_online_search(names, callback=None):
if open_proxy:
proxy = await get_proxy(pageUrl, best=True, with_test=True)
start_time = time()
+ online_search_name = constants.origin_map["online_search"]
def process_channel_by_online_search(name):
- nonlocal proxy, open_proxy, open_driver, page_num
+ nonlocal proxy
info_list = []
driver = None
try:
@@ -166,6 +169,7 @@ def process_channel_by_online_search(name):
for result in results:
url, date, resolution = result
if url and check_url_by_patterns(url):
+ url = add_url_info(url, online_search_name)
url = format_url_with_cache(url)
info_list.append((url, date, resolution))
break
diff --git a/updates/proxy/request.py b/updates/proxy/request.py
index 9118d629d8..f610a937d5 100644
--- a/updates/proxy/request.py
+++ b/updates/proxy/request.py
@@ -28,7 +28,6 @@ def get_proxy_list(page_count=1):
pbar = tqdm(total=len(urls), desc="Getting proxy list")
def get_proxy(url):
- nonlocal open_driver
proxys = []
try:
if open_driver:
diff --git a/updates/subscribe/request.py b/updates/subscribe/request.py
index f3901d46ba..3133225771 100644
--- a/updates/subscribe/request.py
+++ b/updates/subscribe/request.py
@@ -3,6 +3,7 @@
from time import time
from requests import Session, exceptions
from utils.config import config
+import utils.constants as constants
from utils.retry import retry_func
from utils.channel import get_name_url, format_channel_name
from utils.tools import (
@@ -40,6 +41,9 @@ async def get_channels_by_subscribe_urls(
0,
)
session = Session()
+ hotel_name = constants.origin_map["hotel"]
+ multicast_name = constants.origin_map["multicast"]
+ subscribe_name = constants.origin_map["subscribe"]
def process_subscribe_channels(subscribe_info):
if (multicast or hotel) and isinstance(subscribe_info, dict):
@@ -83,9 +87,13 @@ def process_subscribe_channels(subscribe_info):
url = url.partition("$")[0]
if not multicast:
info = (
- f"{region}酒店源"
+ f"{region}{hotel_name}"
if hotel
- else "组播源" if "/rtp/" in url else "订阅源"
+ else (
+ f"{multicast_name}"
+ if "/rtp/" in url
+ else f"{subscribe_name}"
+ )
)
url = add_url_info(url, info)
url = format_url_with_cache(
diff --git a/utils/channel.py b/utils/channel.py
index 6af2faca05..f5ee7edf02 100644
--- a/utils/channel.py
+++ b/utils/channel.py
@@ -7,7 +7,6 @@
add_url_info,
remove_cache_info,
resource_path,
- get_resolution_value,
)
from utils.speed import (
sort_urls_by_speed_and_resolution,
@@ -253,17 +252,19 @@ def get_channel_multicast_result(result, search_result):
Get the channel multicast info result by result and search result
"""
info_result = {}
+ multicast_name = constants.origin_map["multicast"]
for name, result_obj in result.items():
info_list = [
(
(
add_url_info(
f"http://{url}/rtp/{ip}",
- f"{result_region}{result_type}组播源|cache:{url}",
+ f"{result_region}{result_type}{multicast_name}|cache:{url}",
)
if config.open_sort
else add_url_info(
- f"http://{url}/rtp/{ip}", f"{result_region}{result_type}组播源"
+ f"http://{url}/rtp/{ip}",
+ f"{result_region}{result_type}{multicast_name}",
)
),
date,
@@ -614,8 +615,6 @@ async def sort_channel_list(
semaphore,
ffmpeg=False,
ipv6_proxy=None,
- filter_resolution=False,
- min_resolution=None,
callback=None,
):
"""
@@ -630,10 +629,6 @@ async def sort_channel_list(
)
if sorted_data:
for (url, date, resolution, origin), response_time in sorted_data:
- if resolution and filter_resolution:
- resolution_value = get_resolution_value(resolution)
- if resolution_value < min_resolution:
- continue
logging.info(
f"Name: {name}, URL: {url}, Date: {date}, Resolution: {resolution}, Response Time: {response_time} ms"
)
@@ -670,8 +665,6 @@ async def process_sort_channel_list(data, ipv6=False, callback=None):
semaphore,
ffmpeg=is_ffmpeg,
ipv6_proxy=ipv6_proxy,
- filter_resolution=config.open_filter_resolution,
- min_resolution=config.min_resolution_value,
callback=callback,
)
)
@@ -718,12 +711,6 @@ async def process_sort_channel_list(data, ipv6=False, callback=None):
continue
response_time, resolution = cache
if response_time and response_time != float("inf"):
- if resolution:
- if config.open_filter_resolution:
- resolution_value = get_resolution_value(resolution)
- if resolution_value < config.min_resolution_value:
- continue
- url = add_url_info(url, resolution)
append_data_to_info_data(
sort_data,
cate,
@@ -838,6 +825,4 @@ def format_channel_url_info(data):
for url_info in obj.values():
for i, (url, date, resolution, origin) in enumerate(url_info):
url = remove_cache_info(url)
- if resolution:
- url = add_url_info(url, resolution)
url_info[i] = (url, date, resolution, origin)
diff --git a/utils/constants.py b/utils/constants.py
index 79c46c0bde..1b1a95a706 100644
--- a/utils/constants.py
+++ b/utils/constants.py
@@ -51,3 +51,10 @@
"CCTV17农业农村": "CCTV17",
"CCTV17农业": "CCTV17",
}
+
+origin_map = {
+ "hotel": "酒店源",
+ "multicast": "组播源",
+ "subscribe": "订阅源",
+ "online_search": "关键字源",
+}
diff --git a/utils/speed.py b/utils/speed.py
index 90abd91902..2b04878760 100644
--- a/utils/speed.py
+++ b/utils/speed.py
@@ -103,8 +103,6 @@ async def check_stream_speed(url_info):
frame, resolution = get_video_info(video_info)
if frame is None or frame == float("inf"):
return float("inf")
- if resolution:
- url_info[0] = add_url_info(url, resolution)
url_info[2] = resolution
return (url_info, frame)
except Exception as e:
diff --git a/utils/tools.py b/utils/tools.py
index 5a1e37db89..3019749db6 100644
--- a/utils/tools.py
+++ b/utils/tools.py
@@ -6,6 +6,7 @@
from urllib.parse import urlparse
import socket
from utils.config import config
+import utils.constants as constants
import re
from bs4 import BeautifulSoup
from flask import render_template_string, send_file
@@ -140,9 +141,9 @@ def get_total_urls_from_info_list(infoList, ipv6=False):
continue
if origin == "important":
- pure_url, _, info = url.partition("$")
- new_info = info.partition("!")[2]
- total_urls.append(f"{pure_url}${new_info}" if new_info else pure_url)
+ im_url, _, im_info = url.partition("$")
+ im_info_value = im_info.partition("!")[2]
+ total_urls.append(f"{im_url}${im_info_value}" if im_info_value else im_url)
continue
if origin == "subscribe" and "/rtp/" in url:
@@ -156,9 +157,18 @@ def get_total_urls_from_info_list(infoList, ipv6=False):
if resolution_value < config.min_resolution_value:
continue
+ pure_url, _, info = url.partition("$")
+ if not info:
+ origin_name = constants.origin_map.get(origin)
+ if origin_name:
+ url = add_url_info(pure_url, origin_name)
+
url_is_ipv6 = is_ipv6(url)
if url_is_ipv6:
- url += "|IPv6"
+ url = add_url_info(url, "IPv6")
+
+ if resolution:
+ url = add_url_info(url, resolution)
if url_is_ipv6:
categorized_urls[origin]["ipv6"].append(url)
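The reworked logic in `get_total_urls_from_info_list` now layers info onto a URL in a fixed order: the origin label (only when the URL carries no `$` info yet), then an IPv6 tag, then the resolution. A standalone sketch under stated assumptions — `add_url_info` here is an assumed stand-in (the real helper lives in utils/tools.py and is not shown in this diff), inferred from the `$`-partition logic elsewhere in the patch:

```python
def add_url_info(url: str, info: str) -> str:
    # Assumed behavior: attach the first piece of info with "$",
    # subsequent pieces with "|".
    if not info:
        return url
    return f"{url}|{info}" if "$" in url else f"{url}${info}"

origin_map = {"hotel": "酒店源", "multicast": "组播源",
              "subscribe": "订阅源", "online_search": "关键字源"}

def decorate(url: str, origin: str, is_ipv6: bool, resolution: str) -> str:
    # Label with the origin name only if the URL has no "$" info yet,
    # then tag IPv6 URLs, then append the resolution.
    pure_url, _, info = url.partition("$")
    if not info:
        origin_name = origin_map.get(origin)
        if origin_name:
            url = add_url_info(pure_url, origin_name)
    if is_ipv6:
        url = add_url_info(url, "IPv6")
    if resolution:
        url = add_url_info(url, resolution)
    return url

print(decorate("http://a.m3u8", "hotel", True, "1920x1080"))
# http://a.m3u8$酒店源|IPv6|1920x1080
```

URLs that already carry custom info (e.g. a whitelist annotation) keep it untouched, since the origin label is only added when the `$` segment is empty.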
diff --git a/version.json b/version.json
index 7f10564965..89704bb19d 100644
--- a/version.json
+++ b/version.json
@@ -1,4 +1,4 @@
{
- "version": "1.5.0",
+ "version": "1.5.1",
"name": "IPTV电视直播源更新工具"
}
\ No newline at end of file