feat: add multi-market analysis and sqlite-backed reporting
@@ -1,18 +1,19 @@
 ---
 name: stockbuddy
-description: 港股分析助手,提供港股技术面和基本面综合分析,给出买入/卖出操作建议。支持单只股票查询分析、持仓批量分析和持仓管理。当用户提到"港股"、"股票分析"、"持仓分析"、"买入建议"、"卖出建议"、港股代码(如 0700.HK、700)或港股公司名称(如腾讯、阿里、比亚迪)时触发此技能。
+description: 多市场股票分析助手,提供 A 股、港股、美股的技术面和基础估值分析,给出买入/卖出操作建议。支持单只股票查询分析、持仓批量分析、关注股票管理和持仓管理。当用户提到"股票分析"、"持仓分析"、"关注股票"、"买入建议"、"卖出建议",或提供 A 股代码(如 600519 / SH600519)、港股代码(如 0700.HK / 700)、美股代码(如 AAPL / TSLA)时触发此技能。
 ---
 
-# 港股分析助手 (StockBuddy)
+# 多市场股票分析助手 (StockBuddy)
 
 ## 概述
 
-港股技术面与基本面综合分析工具,输出量化评分和明确操作建议(强烈买入/买入/持有/卖出/强烈卖出)。
+A 股、港股、美股的技术面与基础估值综合分析工具,输出量化评分和明确操作建议(强烈买入/买入/持有/卖出/强烈卖出)。
 
-三大核心场景:
-1. **单只股票分析** — 对指定港股进行完整技术面+基本面分析,给出操作建议
+四大核心场景:
+1. **单只股票分析** — 对指定股票进行完整技术面+基本面分析,给出操作建议
 2. **持仓批量分析** — 对用户所有持仓股票批量分析,给出各股操作建议和整体盈亏统计
 3. **持仓管理** — 增删改查持仓记录
+4. **关注池管理** — 增删改查关注股票,并记录股票基本信息
 
 ## 环境准备
 
@@ -22,7 +23,7 @@ description: 港股分析助手,提供港股技术面和基本面综合分析
 bash {{SKILL_DIR}}/scripts/install_deps.sh
 ```
 
-所需依赖:`numpy`、`pandas`(无需 yfinance,已改用腾讯财经数据源)
+所需依赖:`numpy`、`pandas`、Python 内置 `sqlite3`(无需 yfinance,已改用腾讯财经数据源)
 
 ## 核心工作流
 
@@ -33,9 +34,10 @@ bash {{SKILL_DIR}}/scripts/install_deps.sh
 **步骤:**
 
 1. **识别股票代码**
-   - 用户提供数字代码 → 标准化为 `XXXX.HK` 格式(自动补零)
-   - 用户提供中文名称 → 查阅 `references/hk_stock_codes.md` 匹配对应代码
-   - 无法匹配时 → 询问用户确认具体代码
+   - 港股:标准化为 `XXXX.HK`
+   - A 股:标准化为 `SH600519` / `SZ000001`
+   - 美股:标准化为 `AAPL` / `TSLA`
+   - 用户提供中文名称时,可先根据上下文判断市场;无法唯一匹配时再向用户确认
 
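上述标准化规则可以用如下简化示例说明(仅作演示,函数为示意性实现,实际以脚本中的 `normalize_stock_code` 为准):

```python
def normalize_code(raw: str) -> str:
    """按市场规则标准化股票代码(简化示意版)。"""
    raw = raw.strip().upper()
    if raw.endswith(".HK"):                       # 已带 .HK 后缀的港股
        digits = raw[:-3].lstrip("0") or "0"
        return digits.zfill(4) + ".HK"
    if raw[:2] in ("SH", "SZ") and raw[2:].isdigit():
        return raw                                # 已带交易所前缀的 A 股
    if raw.isdigit():
        if len(raw) <= 5:                         # 纯数字且不超过 5 位 → 港股
            return (raw.lstrip("0") or "0").zfill(4) + ".HK"
        if len(raw) == 6:                         # 6 位数字 → A 股,5/6/9 开头为沪市
            return ("SH" if raw[0] in "569" else "SZ") + raw
    return raw                                    # 其余视为美股代码,如 AAPL / TSLA
```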
2. **执行分析脚本**
|
||||
```bash
|
||||
@@ -43,7 +45,7 @@ bash {{SKILL_DIR}}/scripts/install_deps.sh
 ```
 可选周期参数:`1mo` / `3mo` / `6mo`(默认)/ `1y` / `2y` / `5y`
 
-**缓存机制**:脚本内置 10 分钟缓存,同一股票短时间内重复分析不会重复请求腾讯财经。若用户明确要求"刷新数据"或"重新分析",加 `--no-cache` 参数强制刷新。清除所有缓存:`--clear-cache`。
+**数据与缓存机制**:原始日线 K 线、关注池、持仓数据统一保存在 `~/.stockbuddy/stockbuddy.db`(SQLite)。持仓记录通过 `watchlist_id` 关联关注股票主键。分析结果单独写入 SQLite 缓存表,默认 TTL 为 10 分钟,写入时自动清理过期缓存,并将总缓存条数控制在 1000 条以内。若用户明确要求"刷新数据"或"重新分析",加 `--no-cache` 参数强制刷新。清除分析缓存:`--clear-cache`。
 
3. **解读并呈现结果**
|
||||
- 脚本输出 JSON 格式分析数据
|
||||
@@ -60,6 +62,7 @@ bash {{SKILL_DIR}}/scripts/install_deps.sh
 ```bash
 python3 {{SKILL_DIR}}/scripts/portfolio_manager.py list
 ```
+持仓数据保存在 `~/.stockbuddy/stockbuddy.db` 的 `positions` 表。
 
 2. **持仓为空时** → 引导用户添加持仓(参见场景三的添加操作)
 
@@ -84,7 +87,18 @@ bash {{SKILL_DIR}}/scripts/install_deps.sh
 | 更新 | `python3 {{SKILL_DIR}}/scripts/portfolio_manager.py update <代码> [--price <价格>] [--shares <数量>] [--note <备注>]` |
 | 移除 | `python3 {{SKILL_DIR}}/scripts/portfolio_manager.py remove <代码>` |
 
-添加持仓时,若用户未提供日期,默认使用当天日期。若用户提供了自然语言信息(如"我上周花 350 买了 100 股腾讯"),提取价格、数量、日期等参数后执行命令。
+添加持仓时会自动确保该股票存在于关注池,并通过 `positions.watchlist_id -> watchlist.id` 关联。若用户未提供日期,默认使用当天日期。若用户提供了自然语言信息(如"我上周花 350 买了 100 股腾讯"),提取价格、数量、日期等参数后执行命令。
 
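持仓与关注池的外键关联可以用如下示意 schema 说明(列名为假设,实际表结构以 db 模块为准):

```python
import sqlite3

# 示意 DDL:仅为说明 positions.watchlist_id -> watchlist.id 的关联方式
SCHEMA = """
CREATE TABLE IF NOT EXISTS watchlist (
    id   INTEGER PRIMARY KEY AUTOINCREMENT,
    code TEXT UNIQUE NOT NULL,
    name TEXT
);
CREATE TABLE IF NOT EXISTS positions (
    id           INTEGER PRIMARY KEY AUTOINCREMENT,
    watchlist_id INTEGER NOT NULL REFERENCES watchlist(id),
    shares       REAL NOT NULL,
    price        REAL NOT NULL,
    buy_date     TEXT
);
"""

def add_position(conn: sqlite3.Connection, code: str, shares: float,
                 price: float, buy_date: str) -> int:
    """添加持仓前先确保股票在关注池中,再通过 watchlist_id 关联。"""
    conn.execute("INSERT OR IGNORE INTO watchlist (code) VALUES (?)", (code,))
    wid = conn.execute("SELECT id FROM watchlist WHERE code = ?", (code,)).fetchone()[0]
    conn.execute("INSERT INTO positions (watchlist_id, shares, price, buy_date) "
                 "VALUES (?, ?, ?, ?)", (wid, shares, price, buy_date))
    conn.commit()
    return wid
```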
+### 场景四:关注池管理
+
+触发示例:"关注腾讯"、"把苹果加到关注列表"、"取消关注茅台"
+
+| 操作 | 命令 |
+|------|------|
+| 查看关注池 | `python3 {{SKILL_DIR}}/scripts/portfolio_manager.py watch-list` |
+| 添加关注 | `python3 {{SKILL_DIR}}/scripts/portfolio_manager.py watch-add <代码>` |
+| 取消关注 | `python3 {{SKILL_DIR}}/scripts/portfolio_manager.py watch-remove <代码>` |
+
 
 ## 分析方法论
 
@@ -5,30 +5,50 @@
 ```
 ## 📊 {公司名称} ({股票代码}) 分析报告
 
-**当前价格**: HK$ {价格} ({涨跌幅}%)
+**市场**: {市场} | **交易所**: {交易所} | **币种**: {币种}
+**当前价格**: {币种符号}{价格} ({涨跌幅}%)
 **分析时间**: {时间}
 **数据周期**: {周期}
 
---
|
||||
|
||||
### {建议图标} 操作建议: {操作建议}
|
||||
**综合评分**: {评分}/10
|
||||
**动作类型**: {动作类型}
|
||||
**综合评分**: {评分}
|
||||
**置信度**: {置信度等级} ({置信度分数})
|
||||
**市场场景**: {市场场景}
|
||||
|
||||
#### 核心信号:
|
||||
{逐条列出关键信号,每条一行,用 - 前缀}
|
||||
#### 核心信号
|
||||
- {核心信号1}
|
||||
- {核心信号2}
|
||||
- {核心信号3}
|
||||
- {核心信号4}
|
||||
- {核心信号5}
|
||||
- {核心信号6}
|
||||
|
||||
---
|
||||
|
||||
### 📈 技术面分析
|
||||
### 📈 多层评分
|
||||
|
||||
| 评分层 | 分数 | 解读 |
|
||||
|------|------|------|
|
||||
| 趋势层 | {趋势层} | {趋势层解读} |
|
||||
| 动量层 | {动量层} | {动量层解读} |
|
||||
| 风险层 | {风险层} | {风险层解读} |
|
||||
| 估值层 | {估值层} | {估值层解读} |
|
||||
| 相对强弱 | {相对强弱} | {相对强弱解读} |
|
||||
| 量价结构 | {量价结构} | {量价结构解读} |
|
||||
|
||||
### 📉 技术面细节
|
||||
|
||||
| 指标 | 数值 | 信号 |
|
||||
|------|------|------|
|
||||
| 均线趋势 | {均线排列} | {信号} |
|
||||
| MACD | DIF:{DIF} DEA:{DEA} | {信号} |
|
||||
| RSI(12) | {RSI值} | {信号} |
|
||||
| KDJ | K:{K} D:{D} J:{J} | {信号} |
|
||||
| 布林带 | 上:{上轨} 中:{中轨} 下:{下轨} | {信号} |
|
||||
| 成交量 | 量比:{量比} | {信号} |
|
||||
| 均线趋势 | {均线排列} | {均线信号} |
|
||||
| MACD | DIF:{DIF} DEA:{DEA} MACD:{MACD} | {MACD信号} |
|
||||
| RSI | RSI6:{RSI6} RSI12:{RSI12} RSI24:{RSI24} | {RSI信号} |
|
||||
| KDJ | K:{K} D:{D} J:{J} | {KDJ信号} |
|
||||
| 布林带 | 上:{上轨} 中:{中轨} 下:{下轨} | {布林带信号} |
|
||||
| 成交量 | 量比:{量比} | {成交量信号} |
|
||||
|
||||
### 📋 基本面概况
|
||||
|
||||
@@ -36,14 +56,27 @@
|
||||
|------|------|
|
||||
| 市盈率(PE) | {PE} |
|
||||
| 市净率(PB) | {PB} |
|
||||
| 股息率 | {股息率}% |
|
||||
| ROE | {ROE}% |
|
||||
| 收入增长 | {增长}% |
|
||||
| 市值 | {市值} |
|
||||
| 52周区间 | {低} - {高} |
|
||||
| 52周高点 | {52周高点} |
|
||||
| 52周低点 | {52周低点} |
|
||||
| 52周位置 | {52周位置} |
|
||||
| 基本面判断 | {基本面判断} |
|
||||
|
||||
### 🧪 历史验证
|
||||
|
||||
| 指标 | 数值 |
|
||||
|------|------|
|
||||
| 相似样本数 | {样本数} |
|
||||
| 5日平均收益 | {5日平均收益}% |
|
||||
| 5日胜率 | {5日胜率}% |
|
||||
| 10日平均收益 | {10日平均收益}% |
|
||||
| 10日胜率 | {10日胜率}% |
|
||||
| 20日平均收益 | {20日平均收益}% |
|
||||
| 20日胜率 | {20日胜率}% |
|
||||
| 回撤代理 | {回撤代理}% |
|
||||
|
||||
### 💡 分析总结
|
||||
{2-3句话的自然语言总结,包含操作建议和风险提示}
|
||||
{2-4句话的自然语言总结,至少包含:当前市场场景、操作建议、置信度、主要支撑/风险点。若历史验证样本不足,要明确提醒。}
|
||||
|
||||
> ⚠️ 以上分析仅供参考,不构成投资建议。投资有风险,入市需谨慎。
|
||||
```
|
||||
@@ -60,9 +93,9 @@
|
||||
|
||||
| 指标 | 数值 |
|
||||
|------|------|
|
||||
| 总成本 | HK$ {总成本} |
|
||||
| 总市值 | HK$ {总市值} |
|
||||
| 总盈亏 | HK$ {盈亏} ({盈亏比例}%) |
|
||||
| 总成本 | {总成本} |
|
||||
| 总市值 | {总市值} |
|
||||
| 总盈亏 | {盈亏} ({盈亏比例}%) |
|
||||
|
||||
---
|
||||
|
||||
@@ -71,10 +104,12 @@
|
||||
{对每只股票输出简要分析卡片,格式如下:}
|
||||
|
||||
#### {序号}. {公司名称} ({股票代码}) — {操作建议图标} {操作建议}
|
||||
- **当前价**: HK$ {当前价} | **买入价**: HK$ {买入价}
|
||||
- **持仓数量**: {数量}股 | **盈亏**: HK$ {盈亏} ({盈亏比例}%)
|
||||
- **综合评分**: {评分}/10
|
||||
- **关键信号**: {1-2条最重要的信号}
|
||||
- **市场/币种**: {市场} / {币种}
|
||||
- **动作类型**: {动作类型} | **场景**: {市场场景}
|
||||
- **当前价**: {当前价} | **买入价**: {买入价}
|
||||
- **持仓数量**: {数量}股 | **盈亏**: {盈亏} ({盈亏比例}%)
|
||||
- **综合评分**: {评分} | **置信度**: {置信度等级} ({置信度分数})
|
||||
- **核心信号**: {1-3条最重要的信号}
|
||||
|
||||
---
|
||||
|
||||
@@ -82,15 +117,18 @@
|
||||
{综合所有持仓的建议,明确指出:}
|
||||
- 建议加仓的股票及理由
|
||||
- 建议减仓/卖出的股票及理由
|
||||
- 建议继续持有的股票及理由
|
||||
- 建议继续持有/观察的股票及理由
|
||||
- 如不同市场混合持仓,指出币种和市场风险差异
|
||||
|
||||
> ⚠️ 以上分析仅供参考,不构成投资建议。投资有风险,入市需谨慎。
|
||||
```
|
||||
|
||||
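模板中"历史验证"各项(N 日平均收益、N 日胜率)的含义可以用如下简化计算说明(仅为示意,实际逻辑在分析脚本的回测函数中):

```python
def forward_stats(closes: list[float], entry_idxs: list[int], horizon: int) -> dict:
    """对一组相似历史信号点,统计 horizon 日后的平均收益与胜率(简化示意)。"""
    rets = [
        closes[i + horizon] / closes[i] - 1
        for i in entry_idxs
        if i + horizon < len(closes)
    ]
    if not rets:
        return {"samples": 0, "message": "样本不足"}  # 样本不足时不伪造数值
    return {
        "samples": len(rets),
        "avg_return_pct": round(sum(rets) / len(rets) * 100, 2),
        "win_rate_pct": round(sum(r > 0 for r in rets) / len(rets) * 100, 2),
    }
```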
## 模板使用说明
|
||||
|
||||
- 所有 `{占位符}` 根据脚本返回的 JSON 数据填充
|
||||
- 操作建议图标映射:🟢🟢 强烈买入 / 🟢 买入 / 🟡 持有 / 🔴 卖出 / 🔴🔴 强烈卖出
|
||||
- 数值保留合理小数位(价格 2-3 位,百分比 2 位)
|
||||
- 若某项基本面数据为 null/缺失,显示为 "N/A"
|
||||
- 分析总结部分使用自然语言,避免机械堆砌数据
|
||||
- 所有 `{占位符}` 根据脚本返回的 JSON 数据填充。
|
||||
- 操作建议图标映射:🟢🟢 强烈买入 / 🟢 买入 / 🟡 持有 / 🔴 卖出 / 🔴🔴 强烈卖出。
|
||||
- 单股报告优先展示:`recommendation.action`、`recommendation.action_type`、`recommendation.confidence`、`recommendation.regime`、`recommendation.layer_scores`、`signal_validation`。
|
||||
- 价格和盈亏前缀不要写死为 HK$,应按币种动态展示(HKD/CNY/USD)。
|
||||
- 若某项历史验证不存在或样本不足,显示为 `样本不足`,不要伪造数值。
|
||||
- 若某项基本面数据缺失,显示为 `N/A`。
|
||||
- 分析总结部分使用自然语言,避免机械堆砌指标;要把“为什么是这个评级”说清楚。
|
||||
|
||||
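其中"按币种动态展示价格前缀"一条可以用如下示意函数实现(映射表为假设,可按需要调整):

```python
CURRENCY_SYMBOLS = {"HKD": "HK$", "CNY": "¥", "USD": "$"}

def fmt_price(value: float, currency: str) -> str:
    """按币种选择前缀,避免把 HK$ 写死在模板里。"""
    symbol = CURRENCY_SYMBOLS.get(currency, currency + " ")
    return f"{symbol}{value:,.2f}"
```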
@@ -15,12 +15,37 @@ import sys
|
||||
import json
|
||||
import argparse
|
||||
import time
|
||||
import hashlib
|
||||
import urllib.request
|
||||
import urllib.error
|
||||
from datetime import datetime, timedelta
|
||||
from pathlib import Path
|
||||
|
||||
try:
|
||||
from db import (
|
||||
ANALYSIS_CACHE_TTL_SECONDS,
|
||||
clear_analysis_cache,
|
||||
get_cached_analysis,
|
||||
get_kline_df,
|
||||
get_latest_kline_date,
|
||||
init_db,
|
||||
set_cached_analysis,
|
||||
upsert_kline_df,
|
||||
upsert_watchlist_item,
|
||||
)
|
||||
except ImportError:
|
||||
sys.path.insert(0, str(Path(__file__).resolve().parent))
|
||||
from db import (
|
||||
ANALYSIS_CACHE_TTL_SECONDS,
|
||||
clear_analysis_cache,
|
||||
get_cached_analysis,
|
||||
get_kline_df,
|
||||
get_latest_kline_date,
|
||||
init_db,
|
||||
set_cached_analysis,
|
||||
upsert_kline_df,
|
||||
upsert_watchlist_item,
|
||||
)
|
||||
|
||||
try:
|
||||
import numpy as np
|
||||
except ImportError:
|
||||
@@ -38,90 +63,98 @@ except ImportError:
|
||||
# 缓存与重试机制
|
||||
# ─────────────────────────────────────────────
|
||||
|
||||
DATA_DIR = Path.home() / ".stockbuddy"
|
||||
CACHE_DIR = DATA_DIR / "cache"
|
||||
CACHE_TTL_SECONDS = 600 # 缓存有效期 10 分钟
|
||||
LEGACY_CACHE_DIR = Path.home() / ".stock_buddy_cache"
|
||||
MAX_RETRIES = 3
|
||||
RETRY_BASE_DELAY = 2
|
||||
|
||||
|
||||
def _cache_key(code: str, period: str) -> str:
|
||||
"""生成缓存文件名"""
|
||||
key = f"{code}_{period}"
|
||||
return hashlib.md5(key.encode()).hexdigest() + ".json"
|
||||
|
||||
|
||||
def _read_cache(code: str, period: str) -> dict | None:
|
||||
"""读取缓存"""
|
||||
cache_file = CACHE_DIR / _cache_key(code, period)
|
||||
if not cache_file.exists():
|
||||
legacy_cache_file = LEGACY_CACHE_DIR / _cache_key(code, period)
|
||||
if legacy_cache_file.exists():
|
||||
try:
|
||||
DATA_DIR.mkdir(parents=True, exist_ok=True)
|
||||
CACHE_DIR.mkdir(parents=True, exist_ok=True)
|
||||
cache_file.write_text(
|
||||
legacy_cache_file.read_text(encoding="utf-8"),
|
||||
encoding="utf-8",
|
||||
)
|
||||
except OSError:
|
||||
cache_file = legacy_cache_file
|
||||
|
||||
if not cache_file.exists():
|
||||
return None
|
||||
|
||||
try:
|
||||
with open(cache_file, "r", encoding="utf-8") as f:
|
||||
cached = json.load(f)
|
||||
cached_time = datetime.fromisoformat(cached.get("analysis_time", ""))
|
||||
if (datetime.now() - cached_time).total_seconds() < CACHE_TTL_SECONDS:
|
||||
cached["_from_cache"] = True
|
||||
return cached
|
||||
except (json.JSONDecodeError, ValueError, KeyError):
|
||||
pass
|
||||
return None
|
||||
|
||||
|
||||
def _write_cache(code: str, period: str, data: dict):
|
||||
"""写入缓存"""
|
||||
DATA_DIR.mkdir(parents=True, exist_ok=True)
|
||||
CACHE_DIR.mkdir(parents=True, exist_ok=True)
|
||||
cache_file = CACHE_DIR / _cache_key(code, period)
|
||||
try:
|
||||
with open(cache_file, "w", encoding="utf-8") as f:
|
||||
json.dump(data, f, ensure_ascii=False, indent=2, default=str)
|
||||
except OSError:
|
||||
pass
|
||||
ANALYSIS_CACHE_TTL = ANALYSIS_CACHE_TTL_SECONDS
|
||||
|
||||
|
||||
# ─────────────────────────────────────────────
|
||||
# 腾讯财经数据获取
|
||||
# ─────────────────────────────────────────────
|
||||
|
||||
def normalize_hk_code(code: str) -> tuple[str, str]:
|
||||
"""标准化港股代码,返回 (原始数字代码, 带.HK后缀代码)"""
|
||||
code = code.strip().upper().replace(".HK", "")
|
||||
digits = code.lstrip("0")
|
||||
if digits.isdigit():
|
||||
numeric_code = code.zfill(4)
|
||||
return numeric_code, numeric_code + ".HK"
|
||||
return code, code + ".HK"
|
||||
|
||||
def normalize_stock_code(code: str) -> dict:
|
||||
"""标准化股票代码,支持港股/A股/美股。"""
|
||||
raw = code.strip().upper()
|
||||
|
||||
if raw.endswith('.HK'):
|
||||
digits = raw[:-3].lstrip('0') or '0'
|
||||
return {
|
||||
'market': 'HK',
|
||||
'code': digits.zfill(4) + '.HK',
|
||||
'tencent_symbol': 'hk' + digits.zfill(5),
|
||||
'exchange': 'HKEX',
|
||||
}
|
||||
|
||||
if raw.startswith(('SH', 'SZ')) and len(raw) == 8 and raw[2:].isdigit():
|
||||
market = raw[:2]
|
||||
return {
|
||||
'market': market,
|
||||
'code': raw,
|
||||
'tencent_symbol': raw.lower(),
|
||||
'exchange': 'SSE' if market == 'SH' else 'SZSE',
|
||||
}
|
||||
|
||||
if raw.endswith('.US'):
|
||||
symbol = raw[:-3]
|
||||
return {
|
||||
'market': 'US',
|
||||
'code': symbol,
|
||||
'tencent_symbol': 'us' + symbol,
|
||||
'exchange': 'US',
|
||||
}
|
||||
|
||||
if raw.startswith('US.'):
|
||||
symbol = raw[3:]
|
||||
return {
|
||||
'market': 'US',
|
||||
'code': symbol,
|
||||
'tencent_symbol': 'us' + symbol,
|
||||
'exchange': 'US',
|
||||
}
|
||||
|
||||
if raw.isdigit():
|
||||
if len(raw) <= 5:
|
||||
digits = raw.lstrip('0') or '0'
|
||||
return {
|
||||
'market': 'HK',
|
||||
'code': digits.zfill(4) + '.HK',
|
||||
'tencent_symbol': 'hk' + digits.zfill(5),
|
||||
'exchange': 'HKEX',
|
||||
}
|
||||
if len(raw) == 6:
|
||||
market = 'SH' if raw.startswith(('5', '6', '9')) else 'SZ'
|
||||
return {
|
||||
'market': market,
|
||||
'code': market + raw,
|
||||
'tencent_symbol': (market + raw).lower(),
|
||||
'exchange': 'SSE' if market == 'SH' else 'SZSE',
|
||||
}
|
||||
|
||||
symbol = raw.replace('.', '').replace('-', '')
|
||||
return {
|
||||
'market': 'US',
|
||||
'code': symbol,
|
||||
'tencent_symbol': 'us' + symbol,
|
||||
'exchange': 'US',
|
||||
}
|
||||
|
||||
|
||||
def fetch_tencent_quote(code: str) -> dict:
|
||||
"""获取腾讯财经实时行情"""
|
||||
numeric_code, full_code = normalize_hk_code(code)
|
||||
url = f"http://qt.gtimg.cn/q=hk{numeric_code}"
|
||||
|
||||
stock = normalize_stock_code(code)
|
||||
symbol = stock['tencent_symbol']
|
||||
url = f"http://qt.gtimg.cn/q={symbol}"
|
||||
|
||||
for attempt in range(MAX_RETRIES):
|
||||
try:
|
||||
req = urllib.request.Request(url, headers={
|
||||
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"
|
||||
'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
|
||||
'Referer': 'https://gu.qq.com/',
|
||||
})
|
||||
with urllib.request.urlopen(req, timeout=10) as response:
|
||||
data = response.read().decode("gb2312", errors="ignore")
|
||||
return _parse_tencent_quote(data, numeric_code)
|
||||
data = response.read().decode('gb2312', errors='ignore')
|
||||
return _parse_tencent_quote(data, symbol, stock)
|
||||
except urllib.error.URLError as e:
|
||||
if attempt < MAX_RETRIES - 1:
|
||||
time.sleep(RETRY_BASE_DELAY * (attempt + 1))
|
||||
@@ -130,9 +163,9 @@ def fetch_tencent_quote(code: str) -> dict:
|
||||
return {}
|
||||
|
||||
|
||||
def _parse_tencent_quote(data: str, code: str) -> dict:
|
||||
def _parse_tencent_quote(data: str, symbol: str, stock: dict) -> dict:
|
||||
"""解析腾讯财经实时行情响应"""
|
||||
var_name = f"v_hk{code}"
|
||||
var_name = f"v_{symbol}"
|
||||
for line in data.strip().split(";"):
|
||||
line = line.strip()
|
||||
if not line or var_name not in line:
|
||||
@@ -158,40 +191,53 @@ def _parse_tencent_quote(data: str, code: str) -> dict:
|
||||
# 0:市场 1:名称 2:代码 3:现价 4:昨收 5:今开 6:成交量
|
||||
# 30:时间戳 31:涨跌额 32:涨跌幅 33:最高 34:最低
|
||||
# 39:市盈率 47:市净率 37:总市值 48:52周高 49:52周低
|
||||
market = stock['market']
|
||||
currency = 'HKD' if market == 'HK' else ('CNY' if market in ('SH', 'SZ') else safe_str(35, 'USD') or 'USD')
|
||||
pb_idx = 47 if market in ('HK', 'US') else 46
|
||||
market_cap_idx = 37 if market == 'HK' else (57 if market in ('SH', 'SZ') else 44)
|
||||
high_52_idx = 48 if market in ('HK', 'US') else 41
|
||||
low_52_idx = 49 if market in ('HK', 'US') else 42
|
||||
return {
|
||||
"name": values[1],
|
||||
"code": values[2],
|
||||
"price": safe_float(3),
|
||||
"prev_close": safe_float(4),
|
||||
"open": safe_float(5),
|
||||
"volume": safe_float(6),
|
||||
"high": safe_float(33),
|
||||
"low": safe_float(34),
|
||||
"change_amount": safe_float(31),
|
||||
"change_pct": safe_float(32),
|
||||
"timestamp": safe_str(30),
|
||||
"pe": safe_float(39) if len(values) > 39 else None,
|
||||
"pb": safe_float(47) if len(values) > 47 else None,
|
||||
"market_cap": safe_str(37),
|
||||
"52w_high": safe_float(48) if len(values) > 48 else None,
|
||||
"52w_low": safe_float(49) if len(values) > 49 else None,
|
||||
'name': values[1],
|
||||
'code': stock['code'],
|
||||
'market': market,
|
||||
'exchange': stock.get('exchange'),
|
||||
'tencent_symbol': symbol,
|
||||
'price': safe_float(3),
|
||||
'prev_close': safe_float(4),
|
||||
'open': safe_float(5),
|
||||
'volume': safe_float(6),
|
||||
'high': safe_float(33),
|
||||
'low': safe_float(34),
|
||||
'change_amount': safe_float(31),
|
||||
'change_pct': safe_float(32),
|
||||
'timestamp': safe_str(30),
|
||||
'currency': currency,
|
||||
'pe': safe_float(39) if len(values) > 39 else None,
|
||||
'pb': safe_float(pb_idx) if len(values) > pb_idx else None,
|
||||
'market_cap': safe_str(market_cap_idx),
|
||||
'52w_high': safe_float(high_52_idx) if len(values) > high_52_idx else None,
|
||||
'52w_low': safe_float(low_52_idx) if len(values) > low_52_idx else None,
|
||||
'raw_code': safe_str(2),
|
||||
}
|
||||
return {}
|
||||
|
||||
|
||||
def fetch_tencent_kline(code: str, days: int = 120) -> pd.DataFrame:
|
||||
"""获取腾讯财经K线数据"""
|
||||
numeric_code, full_code = normalize_hk_code(code)
|
||||
url = f"https://web.ifzq.gtimg.cn/appstock/app/fqkline/get?param=hk{numeric_code},day,,,{days},qfq"
|
||||
|
||||
stock = normalize_stock_code(code)
|
||||
symbol = stock['tencent_symbol']
|
||||
url = f"https://web.ifzq.gtimg.cn/appstock/app/fqkline/get?param={symbol},day,,,{days},qfq"
|
||||
|
||||
for attempt in range(MAX_RETRIES):
|
||||
try:
|
||||
req = urllib.request.Request(url, headers={
|
||||
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"
|
||||
'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
|
||||
'Referer': 'https://gu.qq.com/',
|
||||
})
|
||||
with urllib.request.urlopen(req, timeout=15) as response:
|
||||
data = json.loads(response.read().decode("utf-8"))
|
||||
return _parse_tencent_kline(data, numeric_code)
|
||||
data = json.loads(response.read().decode('utf-8'))
|
||||
return _parse_tencent_kline(data, symbol)
|
||||
except (urllib.error.URLError, json.JSONDecodeError) as e:
|
||||
if attempt < MAX_RETRIES - 1:
|
||||
time.sleep(RETRY_BASE_DELAY * (attempt + 1))
|
||||
@@ -200,33 +246,78 @@ def fetch_tencent_kline(code: str, days: int = 120) -> pd.DataFrame:
|
||||
return pd.DataFrame()
|
||||
|
||||
|
||||
def _parse_tencent_kline(data: dict, code: str) -> pd.DataFrame:
|
||||
def _parse_tencent_kline(data: dict, symbol: str) -> pd.DataFrame:
|
||||
"""解析腾讯财经K线数据"""
|
||||
key = f"hk{code}"
|
||||
if data.get("code") != 0 or not data.get("data") or key not in data["data"]:
|
||||
if data.get('code') != 0 or not data.get('data') or symbol not in data['data']:
|
||||
return pd.DataFrame()
|
||||
|
||||
day_data = data["data"][key].get("day", [])
|
||||
|
||||
symbol_data = data['data'][symbol]
|
||||
day_data = symbol_data.get('day') or symbol_data.get('qfqday') or symbol_data.get('hfqday') or []
|
||||
if not day_data:
|
||||
return pd.DataFrame()
|
||||
|
||||
# 格式: [日期, 开盘价, 收盘价, 最低价, 最高价, 成交量]
|
||||
|
||||
records = []
|
||||
for item in day_data:
|
||||
if len(item) >= 6:
|
||||
records.append({
|
||||
"Date": item[0],
|
||||
"Open": float(item[1]),
|
||||
"Close": float(item[2]),
|
||||
"Low": float(item[3]),
|
||||
"High": float(item[4]),
|
||||
"Volume": float(item[5]),
|
||||
'Date': item[0],
|
||||
'Open': float(item[1]),
|
||||
'Close': float(item[2]),
|
||||
'Low': float(item[3]),
|
||||
'High': float(item[4]),
|
||||
'Volume': float(item[5]),
|
||||
})
|
||||
|
||||
|
||||
df = pd.DataFrame(records)
|
||||
if not df.empty:
|
||||
df["Date"] = pd.to_datetime(df["Date"])
|
||||
df.set_index("Date", inplace=True)
|
||||
df['Date'] = pd.to_datetime(df['Date'])
|
||||
df.set_index('Date', inplace=True)
|
||||
return df
|
||||
|
||||
|
||||
def fetch_us_kline_yahoo(symbol: str, period: str = '6mo') -> pd.DataFrame:
|
||||
range_map = {
|
||||
'1mo': '1mo',
|
||||
'3mo': '3mo',
|
||||
'6mo': '6mo',
|
||||
'1y': '1y',
|
||||
'2y': '2y',
|
||||
'5y': '5y',
|
||||
}
|
||||
url = f"https://query1.finance.yahoo.com/v8/finance/chart/{symbol}?range={range_map.get(period, '6mo')}&interval=1d&includePrePost=false"
|
||||
req = urllib.request.Request(url, headers={'User-Agent': 'Mozilla/5.0'})
|
||||
with urllib.request.urlopen(req, timeout=20) as response:
|
||||
data = json.loads(response.read().decode('utf-8'))
|
||||
|
||||
result = data.get('chart', {}).get('result', [])
|
||||
if not result:
|
||||
return pd.DataFrame()
|
||||
result = result[0]
|
||||
timestamps = result.get('timestamp') or []
|
||||
quote = (result.get('indicators', {}).get('quote') or [{}])[0]
|
||||
opens = quote.get('open') or []
|
||||
highs = quote.get('high') or []
|
||||
lows = quote.get('low') or []
|
||||
closes = quote.get('close') or []
|
||||
volumes = quote.get('volume') or []
|
||||
|
||||
records = []
|
||||
for i, ts in enumerate(timestamps):
|
||||
if i >= len(opens) or opens[i] is None or closes[i] is None or highs[i] is None or lows[i] is None:
|
||||
continue
|
||||
records.append({
|
||||
'Date': datetime.fromtimestamp(ts).strftime('%Y-%m-%d'),
|
||||
'Open': float(opens[i]),
|
||||
'Close': float(closes[i]),
|
||||
'Low': float(lows[i]),
|
||||
'High': float(highs[i]),
|
||||
'Volume': float(volumes[i] or 0),
|
||||
})
|
||||
|
||||
df = pd.DataFrame(records)
|
||||
if not df.empty:
|
||||
df['Date'] = pd.to_datetime(df['Date'])
|
||||
df.set_index('Date', inplace=True)
|
||||
return df
|
||||
|
||||
|
||||
@@ -243,6 +334,42 @@ def period_to_days(period: str) -> int:
|
||||
return mapping.get(period, 180)
|
||||
|
||||
|
||||
def min_kline_points(required_days: int) -> int:
|
||||
return 20 if required_days <= 30 else 30
|
||||
|
||||
|
||||
def refresh_kline_cache(code: str, required_days: int, period: str = '6mo') -> pd.DataFrame:
|
||||
"""使用 SQLite 保存日线数据,并按需增量刷新。"""
|
||||
stock = normalize_stock_code(code)
|
||||
buffer_days = 30
|
||||
latest_date = get_latest_kline_date(code)
|
||||
fetch_days = max(required_days + buffer_days, 60)
|
||||
|
||||
if latest_date:
|
||||
latest_dt = datetime.strptime(latest_date, "%Y-%m-%d")
|
||||
missing_days = max((datetime.now() - latest_dt).days, 0)
|
||||
if missing_days <= 2:
|
||||
fetch_days = min(fetch_days, 60)
|
||||
else:
|
||||
fetch_days = max(missing_days + buffer_days, 60)
|
||||
|
||||
fetched = fetch_tencent_kline(code, fetch_days)
|
||||
if stock['market'] == 'US' and len(fetched) <= 2:
|
||||
fetched = fetch_us_kline_yahoo(stock['code'], period)
|
||||
if not fetched.empty:
|
||||
upsert_kline_df(code, fetched, source='yahoo' if stock['market'] == 'US' and len(fetched) > 2 else 'tencent')
|
||||
|
||||
hist = get_kline_df(code, required_days + buffer_days)
|
||||
if len(hist) < min_kline_points(required_days):
|
||||
fallback = fetch_tencent_kline(code, required_days + buffer_days)
|
||||
if stock['market'] == 'US' and len(fallback) <= 2:
|
||||
fallback = fetch_us_kline_yahoo(stock['code'], period)
|
||||
if not fallback.empty:
|
||||
upsert_kline_df(code, fallback, source='yahoo' if stock['market'] == 'US' and len(fallback) > 2 else 'tencent')
|
||||
hist = get_kline_df(code, required_days + buffer_days)
|
||||
return hist
|
||||
|
||||
|
||||
# ─────────────────────────────────────────────
|
||||
# 技术指标计算 (保持不变)
|
||||
# ─────────────────────────────────────────────
|
||||
@@ -454,30 +581,21 @@ def calc_ma_trend(close: pd.Series) -> dict:
|
||||
def get_fundamentals(quote: dict) -> dict:
|
||||
"""基于实时行情数据的基本面分析"""
|
||||
result = {}
|
||||
|
||||
# 估值指标 (腾讯提供的)
|
||||
pe = quote.get("pe")
|
||||
pb = quote.get("pb")
|
||||
result["PE"] = round(pe, 2) if pe else None
|
||||
result["PB"] = round(pb, 2) if pb else None
|
||||
result["PS"] = None # 腾讯不提供
|
||||
|
||||
# 市值
|
||||
result["market_cap"] = quote.get("market_cap", "")
|
||||
|
||||
# 52周价格区间
|
||||
result["52w_high"] = quote.get("52w_high")
|
||||
result["52w_low"] = quote.get("52w_low")
|
||||
|
||||
# 公司信息
|
||||
result["company_name"] = quote.get("name", "未知")
|
||||
result["sector"] = "港股"
|
||||
result["industry"] = "港股"
|
||||
result["currency"] = "HKD"
|
||||
|
||||
# 基本面信号
|
||||
result["fundamental_signal"] = _fundamental_signal(result)
|
||||
|
||||
|
||||
pe = quote.get('pe')
|
||||
pb = quote.get('pb')
|
||||
result['PE'] = round(pe, 2) if pe else None
|
||||
result['PB'] = round(pb, 2) if pb else None
|
||||
result['PS'] = None
|
||||
result['market_cap'] = quote.get('market_cap', '')
|
||||
result['52w_high'] = quote.get('52w_high')
|
||||
result['52w_low'] = quote.get('52w_low')
|
||||
result['company_name'] = quote.get('name', '未知')
|
||||
result['sector'] = quote.get('market', '未知市场')
|
||||
result['industry'] = quote.get('exchange') or quote.get('market', '未知')
|
||||
result['currency'] = quote.get('currency', 'N/A')
|
||||
result['market'] = quote.get('market', 'N/A')
|
||||
result['fundamental_signal'] = _fundamental_signal(result)
|
||||
return result
|
||||
|
||||
|
||||
@@ -523,125 +641,229 @@ def _fundamental_signal(data: dict) -> str:
|
||||
# 综合评分与建议
|
||||
# ─────────────────────────────────────────────
|
||||
|
||||
def generate_recommendation(technical: dict, fundamental: dict, current_price: float) -> dict:
|
||||
"""综合技术面和基本面给出操作建议"""
|
||||
score = 0
|
||||
signals = []
|
||||
MARKET_PROFILES = {
|
||||
"HK": {"technical": 0.62, "fundamental": 0.38, "risk_penalty": 1.0},
|
||||
"SH": {"technical": 0.58, "fundamental": 0.42, "risk_penalty": 0.9},
|
||||
"SZ": {"technical": 0.60, "fundamental": 0.40, "risk_penalty": 1.0},
|
||||
"US": {"technical": 0.55, "fundamental": 0.45, "risk_penalty": 0.85},
|
||||
}
|
||||
|
||||
# 技术面评分
|
||||
|
||||
def clamp(value: float, low: float, high: float) -> float:
|
||||
return max(low, min(high, value))
|
||||
|
||||
|
||||
def detect_market_regime(hist: pd.DataFrame, technical: dict, quote: dict) -> dict:
|
||||
close = hist["Close"]
|
||||
ma20 = close.rolling(20).mean().iloc[-1] if len(close) >= 20 else close.iloc[-1]
|
||||
ma60 = close.rolling(60).mean().iloc[-1] if len(close) >= 60 else ma20
|
||||
current = close.iloc[-1]
|
||||
rsi12 = technical.get("rsi", {}).get("RSI12", technical.get("rsi", {}).get("RSI6", 50))
|
||||
high_52w = quote.get("52w_high")
|
||||
low_52w = quote.get("52w_low")
|
||||
pos_52w = None
|
||||
if high_52w and low_52w and high_52w != low_52w:
|
||||
pos_52w = (current - low_52w) / (high_52w - low_52w)
|
||||
|
||||
if current > ma20 > ma60 and rsi12 >= 55:
|
||||
regime = "趋势延续"
|
||||
elif rsi12 <= 35 and technical.get("kdj", {}).get("J", 50) < 20:
|
||||
regime = "超跌反弹"
|
||||
elif pos_52w is not None and pos_52w > 0.85 and rsi12 >= 68:
|
||||
regime = "高位风险"
|
||||
elif abs(current / ma20 - 1) < 0.03 and 40 <= rsi12 <= 60:
|
||||
regime = "区间震荡"
|
||||
else:
|
||||
regime = "估值修复/等待确认"
|
||||
|
||||
return {"regime": regime, "position_52w": round(pos_52w, 4) if pos_52w is not None else None}
|
||||
|
||||
|
||||
def compute_layer_scores(hist: pd.DataFrame, technical: dict, fundamental: dict, quote: dict) -> dict:
|
||||
close = hist["Close"]
|
||||
current = close.iloc[-1]
|
||||
ret_5 = (current / close.iloc[-6] - 1) if len(close) > 5 else 0
|
||||
ret_20 = (current / close.iloc[-21] - 1) if len(close) > 20 else ret_5
|
||||
ma = technical.get("ma_trend", {})
|
||||
above = ma.get("price_above_ma_count", "0/1").split("/")
|
||||
above_ratio = (int(above[0]) / max(int(above[1]), 1)) if len(above) == 2 else 0
|
||||
macd_sig = technical.get("macd", {}).get("signal", "")
|
||||
if "买入" in macd_sig or "金叉" in macd_sig:
|
||||
score += 2
|
||||
signals.append(f"MACD: {macd_sig}")
|
||||
elif "卖出" in macd_sig or "死叉" in macd_sig:
|
||||
score -= 2
|
||||
signals.append(f"MACD: {macd_sig}")
|
||||
elif "多头" in macd_sig:
|
||||
score += 1
|
||||
signals.append(f"MACD: {macd_sig}")
|
||||
elif "空头" in macd_sig:
|
||||
score -= 1
|
||||
signals.append(f"MACD: {macd_sig}")
|
||||
|
||||
rsi_sig = technical.get("rsi", {}).get("signal", "")
|
||||
if "超卖" in rsi_sig:
|
||||
score += 2
|
||||
signals.append(f"RSI: {rsi_sig}")
|
||||
elif "超买" in rsi_sig:
|
||||
score -= 2
|
||||
signals.append(f"RSI: {rsi_sig}")
|
||||
|
||||
kdj_sig = technical.get("kdj", {}).get("signal", "")
|
||||
if "买入" in kdj_sig or "金叉" in kdj_sig:
|
||||
score += 1
|
||||
signals.append(f"KDJ: {kdj_sig}")
|
||||
elif "卖出" in kdj_sig or "死叉" in kdj_sig:
|
||||
score -= 1
|
||||
signals.append(f"KDJ: {kdj_sig}")
|
||||
|
||||
rsi = technical.get("rsi", {}).get("RSI12", technical.get("rsi", {}).get("RSI6", 50))
|
||||
kdj_j = technical.get("kdj", {}).get("J", 50)
|
||||
volume_ratio = technical.get("volume", {}).get("volume_ratio", 1)
|
||||
boll_sig = technical.get("bollinger", {}).get("signal", "")
|
||||
if "超卖" in boll_sig or "下轨" in boll_sig:
|
||||
score += 1
|
||||
signals.append(f"布林带: {boll_sig}")
|
||||
elif "超买" in boll_sig or "上轨" in boll_sig:
|
||||
score -= 1
|
||||
signals.append(f"布林带: {boll_sig}")
|
||||
|
||||
ma_sig = technical.get("ma_trend", {}).get("trend_signal", "")
|
||||
if "多头" in ma_sig or "强势" in ma_sig:
|
||||
score += 2
|
||||
signals.append(f"均线: {ma_sig}")
|
||||
elif "空头" in ma_sig or "弱势" in ma_sig:
|
||||
score -= 2
|
||||
signals.append(f"均线: {ma_sig}")
|
||||
elif "偏多" in ma_sig:
|
||||
score += 1
|
||||
elif "偏空" in ma_sig:
|
||||
score -= 1
|
||||
|
||||
vol_sig = technical.get("volume", {}).get("signal", "")
|
||||
if "放量上涨" in vol_sig:
|
||||
score += 1
|
||||
signals.append(f"成交量: {vol_sig}")
|
||||
elif "放量下跌" in vol_sig:
|
||||
score -= 1
|
||||
signals.append(f"成交量: {vol_sig}")
|
||||
|
||||
# 基本面评分
|
||||
fund_sig = fundamental.get("fundamental_signal", "")
|
||||
if "优秀" in fund_sig:
|
||||
score += 2
|
||||
signals.append(f"基本面: {fund_sig}")
|
||||
elif "良好" in fund_sig:
|
||||
score += 1
|
||||
signals.append(f"基本面: {fund_sig}")
|
||||
elif "较差" in fund_sig:
|
||||
score -= 2
|
||||
signals.append(f"基本面: {fund_sig}")
|
||||
|
||||
# 52周位置
|
||||
pe = fundamental.get("PE")
|
||||
pb = fundamental.get("PB")
|
||||
high_52w = fundamental.get("52w_high")
|
||||
low_52w = fundamental.get("52w_low")
|
||||
pos_52w = 0.5
|
||||
if high_52w and low_52w and high_52w != low_52w:
|
||||
    position = (current_price - low_52w) / (high_52w - low_52w)
    if position < 0.2:
        score += 1
        signals.append(f"52周位置: {position:.0%} (接近低点)")
    elif position > 0.9:
        score -= 1
        signals.append(f"52周位置: {position:.0%} (接近高点)")
    else:
        signals.append(f"52周位置: {position:.0%}")
    pos_52w = clamp((quote.get("price", current) - low_52w) / (high_52w - low_52w), 0, 1)

    # Map the total score to an action recommendation
    if score >= 5:
        action = "强烈买入"
        action_en = "STRONG_BUY"
        color = "🟢🟢"
    elif score >= 2:
        action = "买入"
        action_en = "BUY"
        color = "🟢"
    elif score >= -1:
        action = "持有/观望"
        action_en = "HOLD"
        color = "🟡"
    elif score >= -4:
        action = "卖出"
        action_en = "SELL"
        color = "🔴"
    trend = (ret_20 * 100 * 0.6) + (above_ratio - 0.5) * 8
    if "多头" in macd_sig or "金叉" in macd_sig:
        trend += 1.5
    elif "空头" in macd_sig or "死叉" in macd_sig:
        trend -= 1.5

    momentum = ret_5 * 100 * 0.8
    momentum += 1.2 if volume_ratio > 1.5 and ret_5 > 0 else 0
    momentum -= 1.2 if volume_ratio > 1.5 and ret_5 < 0 else 0
    momentum += 0.8 if "金叉" in technical.get("kdj", {}).get("signal", "") else 0
    momentum -= 0.8 if "死叉" in technical.get("kdj", {}).get("signal", "") else 0

    risk = 0.0
    if rsi > 75:
        risk -= 2.2
    elif rsi < 28:
        risk += 1.0
    if kdj_j > 100:
        risk -= 1.2
    elif kdj_j < 0:
        risk += 0.8
    if pos_52w > 0.88:
        risk -= 1.2
    elif pos_52w < 0.18:
        risk += 0.8
    if "突破上轨" in boll_sig:
        risk -= 0.8
    elif "突破下轨" in boll_sig:
        risk += 0.6

    valuation = 0.0
    if pe is not None:
        if 0 < pe < 15:
            valuation += 2.0
        elif pe < 25:
            valuation += 1.0
        elif pe > 40:
            valuation -= 1.5
    if pb is not None:
        if 0 < pb < 1:
            valuation += 1.0
        elif pb > 6:
            valuation -= 1.0

    relative_strength = clamp(ret_20 * 100 / 4, -3, 3)
    volume_structure = clamp((volume_ratio - 1.0) * 2, -2.5, 2.5)

    return {
        "trend": round(clamp(trend, -5, 5), 2),
        "momentum": round(clamp(momentum, -5, 5), 2),
        "risk": round(clamp(risk, -5, 5), 2),
        "valuation": round(clamp(valuation, -5, 5), 2),
        "relative_strength": round(relative_strength, 2),
        "volume_structure": round(volume_structure, 2),
    }
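The layer scores above all rely on a `clamp` helper defined elsewhere in the script; a minimal standalone sketch of the clamp-then-round normalization pattern (the sample values are illustrative, not from the script):

```python
def clamp(value: float, lo: float, hi: float) -> float:
    """Constrain value to the closed interval [lo, hi]."""
    return max(lo, min(hi, value))

# Each layer is clamped to a fixed band and rounded to 2 decimals,
# mirroring the normalization in the return statement above.
raw_layers = {"trend": 7.3, "momentum": -6.1, "risk": 0.456}
normalized = {k: round(clamp(v, -5, 5), 2) for k, v in raw_layers.items()}
print(normalized)
```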


def evaluate_signal_quality(layer_scores: dict) -> dict:
    positives = sum(1 for v in layer_scores.values() if v > 0.8)
    negatives = sum(1 for v in layer_scores.values() if v < -0.8)
    dispersion = max(layer_scores.values()) - min(layer_scores.values())
    agreement = abs(positives - negatives)
    confidence = 40 + agreement * 8 - min(dispersion * 2.5, 18)
    confidence = int(clamp(confidence, 18, 92))
    if confidence >= 72:
        level = "高"
    elif confidence >= 55:
        level = "中"
    else:
        action = "强烈卖出"
        action_en = "STRONG_SELL"
        color = "🔴🔴"
        level = "低"
    return {"score": confidence, "level": level, "positives": positives, "negatives": negatives}
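A worked trace of the confidence formula on a hypothetical set of layer scores (the numbers are assumptions chosen for illustration): strong agreement with modest dispersion lands in the middle band.

```python
# 3 layers clearly positive (> 0.8), none clearly negative (< -0.8).
layers = {"trend": 2.1, "momentum": 1.4, "risk": 1.0, "valuation": 0.2}
positives = sum(1 for v in layers.values() if v > 0.8)    # 3
negatives = sum(1 for v in layers.values() if v < -0.8)   # 0
dispersion = max(layers.values()) - min(layers.values())  # ~1.9
confidence = 40 + abs(positives - negatives) * 8 - min(dispersion * 2.5, 18)
confidence = int(max(18, min(92, confidence)))
print(confidence)  # 59 → level "中" (55 ≤ 59 < 72)
```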


def backtest_current_signal(hist: pd.DataFrame, period: str) -> dict:
    horizons = [5, 10, 20]
    closes = hist["Close"].reset_index(drop=True)
    if len(closes) < 45:
        return {"samples": 0, "message": "历史样本不足"}
    current_ret20 = (closes.iloc[-1] / closes.iloc[-21] - 1) if len(closes) > 20 else 0
    current_ret5 = (closes.iloc[-1] / closes.iloc[-6] - 1) if len(closes) > 5 else 0
    matched = []
    for i in range(25, len(closes) - 20):
        r20 = closes.iloc[i] / closes.iloc[i-20] - 1
        r5 = closes.iloc[i] / closes.iloc[i-5] - 1
        if abs(r20 - current_ret20) < 0.06 and abs(r5 - current_ret5) < 0.04:
            matched.append(i)
    if len(matched) < 5:
        return {"samples": len(matched), "message": "相似信号样本不足"}

    perf = {"samples": len(matched)}
    all_forward = []
    for h in horizons:
        vals = []
        for i in matched:
            if i + h < len(closes):
                vals.append(closes.iloc[i + h] / closes.iloc[i] - 1)
        if vals:
            perf[f"forward_{h}d_avg_pct"] = round(sum(vals) / len(vals) * 100, 2)
            perf[f"forward_{h}d_win_rate"] = round(sum(1 for x in vals if x > 0) / len(vals) * 100, 2)
            all_forward.extend(vals)
    if all_forward:
        perf["max_drawdown_proxy_pct"] = round(min(all_forward) * 100, 2)
    perf["period"] = period
    return perf
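The analog-matching idea above — find past days whose 5d/20d returns resemble today's, then look at their forward returns — can be exercised on synthetic prices (the random series and seed below are illustrative assumptions, not data from the script):

```python
import numpy as np
import pandas as pd

# Synthetic random-walk price series standing in for real K-line closes.
rng = np.random.default_rng(7)
closes = pd.Series(100 * np.cumprod(1 + rng.normal(0, 0.01, 250)))

ret20_now = closes.iloc[-1] / closes.iloc[-21] - 1
ret5_now = closes.iloc[-1] / closes.iloc[-6] - 1

# Same matching rule and tolerances as backtest_current_signal.
matched = [
    i for i in range(25, len(closes) - 20)
    if abs(closes.iloc[i] / closes.iloc[i - 20] - 1 - ret20_now) < 0.06
    and abs(closes.iloc[i] / closes.iloc[i - 5] - 1 - ret5_now) < 0.04
]
fwd10 = [closes.iloc[i + 10] / closes.iloc[i] - 1 for i in matched]
if fwd10:
    print(f"{len(matched)} analogs, avg 10d forward return {sum(fwd10) / len(fwd10):.2%}")
```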


def decide_action_type(regime: str, total_score: float, confidence: dict) -> tuple[str, str]:
    if total_score >= 4.5 and confidence["score"] >= 70:
        return "强烈买入", "趋势型买入" if regime == "趋势延续" else "高置信度买入"
    if total_score >= 2:
        if regime == "超跌反弹":
            return "买入", "超跌博弈型买入"
        return "买入", "趋势跟随型买入"
    if total_score <= -4.5 and confidence["score"] >= 70:
        return "强烈卖出", "风险规避型卖出"
    if total_score <= -2:
        return "卖出", "止盈/止损型卖出"
    return "持有/观望", "等待确认"
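The decision table is a pure function, so it is easy to exercise directly; the regime strings below are the ones the function itself checks, and the score/confidence inputs are made-up examples:

```python
def decide_action_type(regime: str, total_score: float, confidence: dict) -> tuple[str, str]:
    if total_score >= 4.5 and confidence["score"] >= 70:
        return "强烈买入", "趋势型买入" if regime == "趋势延续" else "高置信度买入"
    if total_score >= 2:
        if regime == "超跌反弹":
            return "买入", "超跌博弈型买入"
        return "买入", "趋势跟随型买入"
    if total_score <= -4.5 and confidence["score"] >= 70:
        return "强烈卖出", "风险规避型卖出"
    if total_score <= -2:
        return "卖出", "止盈/止损型卖出"
    return "持有/观望", "等待确认"

print(decide_action_type("趋势延续", 5.0, {"score": 80}))  # strong buy, trend-driven
print(decide_action_type("超跌反弹", 2.5, {"score": 50}))  # buy, oversold-rebound play
print(decide_action_type("震荡", 0.0, {"score": 60}))      # hold, waiting for confirmation
```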


def generate_recommendation(technical: dict, fundamental: dict, current_price: float, hist: pd.DataFrame, quote: dict) -> dict:
    market = quote.get("market", "HK")
    profile = MARKET_PROFILES.get(market, MARKET_PROFILES["HK"])
    regime = detect_market_regime(hist, technical, quote)
    layer_scores = compute_layer_scores(hist, technical, fundamental, quote)
    confidence = evaluate_signal_quality(layer_scores)

    technical_bucket = (
        layer_scores["trend"] * 0.35 +
        layer_scores["momentum"] * 0.25 +
        layer_scores["relative_strength"] * 0.20 +
        layer_scores["volume_structure"] * 0.20
    )
    fundamental_bucket = layer_scores["valuation"]
    risk_bucket = layer_scores["risk"] * profile["risk_penalty"]
    total_score = technical_bucket * profile["technical"] + fundamental_bucket * profile["fundamental"] + risk_bucket
    total_score = round(clamp(total_score, -8, 8), 2)

    action, action_type = decide_action_type(regime["regime"], total_score, confidence)
    icon_map = {"强烈买入": "🟢🟢", "买入": "🟢", "持有/观望": "🟡", "卖出": "🔴", "强烈卖出": "🔴🔴"}
    en_map = {"强烈买入": "STRONG_BUY", "买入": "BUY", "持有/观望": "HOLD", "卖出": "SELL", "强烈卖出": "STRONG_SELL"}
    icon = icon_map[action]

    key_signals = [
        f"市场场景: {regime['regime']}",
        f"趋势层: {layer_scores['trend']}",
        f"动量层: {layer_scores['momentum']}",
        f"风险层: {layer_scores['risk']}",
        f"估值层: {layer_scores['valuation']}",
        f"置信度: {confidence['level']}({confidence['score']})",
    ]

    return {
        "action": action,
        "action_en": action_en,
        "score": score,
        "icon": color,
        "key_signals": signals,
        "summary": f"{color} {action} (综合评分: {score})",
        "action_en": en_map[action],
        "action_type": action_type,
        "score": total_score,
        "icon": icon,
        "market_profile": market,
        "regime": regime,
        "layer_scores": layer_scores,
        "confidence": confidence,
        "key_signals": key_signals,
        "summary": f"{icon} {action} / {action_type} (综合评分: {total_score})",
    }
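The bucket weighting can be traced by hand. The real `MARKET_PROFILES` values live elsewhere in the script, so the profile dict and layer scores below are hypothetical stand-ins used only to show the arithmetic:

```python
# Hypothetical market profile and layer scores (not the script's real values).
profile = {"technical": 0.7, "fundamental": 0.3, "risk_penalty": 1.0}
layer_scores = {"trend": 2.0, "momentum": 1.0, "relative_strength": 1.0,
                "volume_structure": 0.5, "valuation": 1.0, "risk": -1.0}

technical_bucket = (layer_scores["trend"] * 0.35 + layer_scores["momentum"] * 0.25
                    + layer_scores["relative_strength"] * 0.20
                    + layer_scores["volume_structure"] * 0.20)
total = (technical_bucket * profile["technical"]
         + layer_scores["valuation"] * profile["fundamental"]
         + layer_scores["risk"] * profile["risk_penalty"])
total = round(max(-8, min(8, total)), 2)  # same clamp-to-[-8, 8] as above
print(total)
```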


@@ -650,65 +872,77 @@ def generate_recommendation(technical: dict, fundamental: dict, current_price: f
# ─────────────────────────────────────────────

def analyze_stock(code: str, period: str = "6mo", use_cache: bool = True) -> dict:
    """Run a complete analysis of a single HK stock."""
    numeric_code, full_code = normalize_hk_code(code)

    """Run a complete analysis of a single stock."""
    init_db()
    stock = normalize_stock_code(code)
    full_code = stock['code']

    if use_cache:
        cached = _read_cache(full_code, period)
        cached = get_cached_analysis(full_code, period)
        if cached:
            print(f"📦 使用缓存数据 ({full_code}),缓存有效期 {CACHE_TTL_SECONDS}s", file=sys.stderr)
            print(f"📦 使用缓存数据 ({full_code}),缓存有效期 {ANALYSIS_CACHE_TTL}s", file=sys.stderr)
            return cached

    result = {"code": full_code, "analysis_time": datetime.now().isoformat(), "error": None}
    result = {"code": full_code, "market": stock['market'], "analysis_time": datetime.now().isoformat(), "error": None}

    try:
        # 1. Fetch the real-time quote
        quote = fetch_tencent_quote(numeric_code)
        quote = fetch_tencent_quote(full_code)
        if not quote or not quote.get("price"):
            result["error"] = f"无法获取 {full_code} 的实时行情"
            return result

        upsert_watchlist_item(
            code=full_code,
            market=quote.get('market', stock['market']),
            tencent_symbol=quote.get('tencent_symbol', stock['tencent_symbol']),
            name=quote.get('name'),
            exchange=quote.get('exchange', stock.get('exchange')),
            currency=quote.get('currency'),
            last_price=quote.get('price'),
            pe=quote.get('pe'),
            pb=quote.get('pb'),
            market_cap=quote.get('market_cap'),
            week52_high=quote.get('52w_high'),
            week52_low=quote.get('52w_low'),
            quote_time=quote.get('timestamp'),
            meta=quote,
        )

        current_price = quote["price"]
        result["current_price"] = current_price
        result["price_date"] = quote.get("timestamp", "")
        result["price_change"] = quote.get("change_amount")
        result["price_change_pct"] = quote.get("change_pct")

        # 2. Fetch K-line (candlestick) data
        days = period_to_days(period)
        hist = fetch_tencent_kline(numeric_code, days)

        if hist.empty or len(hist) < 30:
        hist = refresh_kline_cache(full_code, days, period)
        if hist.empty or len(hist) < min_kline_points(days):
            result["error"] = f"无法获取 {full_code} 的历史K线数据 (仅获得 {len(hist)} 条)"
            return result

        result["data_points"] = len(hist)

        result["data_points"] = len(hist)
        close = hist["Close"]
        high = hist["High"]
        low = hist["Low"]
        volume = hist["Volume"]

        # 3. Technical analysis
        technical = {}
        technical["ma_trend"] = calc_ma_trend(close)
        technical["macd"] = calc_macd(close)
        technical["rsi"] = calc_rsi(close)
        technical["kdj"] = calc_kdj(high, low, close)
        technical["bollinger"] = calc_bollinger(close)
        technical["volume"] = calc_volume_analysis(volume, close)
        technical = {
            "ma_trend": calc_ma_trend(close),
            "macd": calc_macd(close),
            "rsi": calc_rsi(close),
            "kdj": calc_kdj(high, low, close),
            "bollinger": calc_bollinger(close),
            "volume": calc_volume_analysis(volume, close),
        }
        result["technical"] = technical

        # 4. Fundamental analysis
        fundamental = get_fundamentals(quote)
        result["fundamental"] = fundamental
        result["recommendation"] = generate_recommendation(technical, fundamental, current_price, hist, quote)
        result["signal_validation"] = backtest_current_signal(hist, period)

        # 5. Overall recommendation
        result["recommendation"] = generate_recommendation(technical, fundamental, current_price)

        # 6. Write to cache
        if result.get("error") is None:
            _write_cache(full_code, period, result)
            set_cached_analysis(full_code, period, result)

    except Exception as e:
        result["error"] = f"分析过程出错: {str(e)}"
@@ -717,8 +951,8 @@ def analyze_stock(code: str, period: str = "6mo", use_cache: bool = True) -> dic


def main():
    parser = argparse.ArgumentParser(description="港股分析工具 (腾讯财经数据源)")
    parser.add_argument("code", help="港股代码 (如 0700.HK, 00700, 腾讯)")
    parser = argparse.ArgumentParser(description="多市场股票分析工具 (腾讯财经/Yahoo 数据源)")
    parser.add_argument("code", help="股票代码,如 0700.HK / 600519 / SH600519 / AAPL")
    parser.add_argument("--period", default="6mo", help="数据周期 (1mo/3mo/6mo/1y/2y/5y)")
    parser.add_argument("--output", help="输出JSON文件路径")
    parser.add_argument("--no-cache", action="store_true", help="跳过缓存,强制重新请求数据")
@@ -726,22 +960,15 @@ def main():
    args = parser.parse_args()

    if args.clear_cache:
        import shutil
        cleared = False
        for path in (CACHE_DIR, LEGACY_CACHE_DIR):
            if path.exists():
                shutil.rmtree(path)
                cleared = True
        cleared = clear_analysis_cache()
        if cleared:
            print("✅ 缓存已清除")
            print(f"✅ 已清除 {cleared} 条分析缓存")
        else:
            print("ℹ️ 无缓存可清除")
        return

    result = analyze_stock(args.code, args.period, use_cache=not args.no_cache)

    output = json.dumps(result, ensure_ascii=False, indent=2, default=str)

    if args.output:
        with open(args.output, "w", encoding="utf-8") as f:
            f.write(output)

531	scripts/db.py	Normal file
@@ -0,0 +1,531 @@
#!/usr/bin/env python3
"""
StockBuddy SQLite data layer
"""

from __future__ import annotations

import json
import sqlite3
from datetime import datetime
from pathlib import Path
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    import pandas as pd

DATA_DIR = Path.home() / ".stockbuddy"
DB_PATH = DATA_DIR / "stockbuddy.db"
ANALYSIS_CACHE_TTL_SECONDS = 600
ANALYSIS_CACHE_MAX_ROWS = 1000


def _utc_now_iso() -> str:
    return datetime.utcnow().replace(microsecond=0).isoformat()


def ensure_data_dir() -> None:
    DATA_DIR.mkdir(parents=True, exist_ok=True)


def get_connection() -> sqlite3.Connection:
    ensure_data_dir()
    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    conn.execute("PRAGMA journal_mode=WAL")
    conn.execute("PRAGMA foreign_keys=ON")
    conn.execute("PRAGMA synchronous=NORMAL")
    return conn
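`sqlite3.Row` as the row factory is what lets the rest of this module do `row["cnt"]`-style column access; a minimal illustration of the connection setup, using an in-memory database and a throwaway table instead of the real schema:

```python
import sqlite3

# Mirror the row-factory/PRAGMA setup of get_connection (":memory:" here).
conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute("PRAGMA foreign_keys=ON")

conn.execute("CREATE TABLE t (code TEXT, price REAL)")
conn.execute("INSERT INTO t VALUES ('0700.HK', 321.5)")
row = conn.execute("SELECT * FROM t").fetchone()
print(row["code"], row["price"])  # dict-style access via sqlite3.Row
```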


def _table_columns(conn: sqlite3.Connection, table: str) -> list[str]:
    try:
        rows = conn.execute(f"PRAGMA table_info({table})").fetchall()
        return [row[1] for row in rows]
    except sqlite3.OperationalError:
        return []


def _migrate_schema(conn: sqlite3.Connection) -> None:
    positions_cols = _table_columns(conn, "positions")
    if positions_cols and "watchlist_id" not in positions_cols:
        conn.execute("DROP TABLE positions")


def init_db() -> None:
    with get_connection() as conn:
        _migrate_schema(conn)
        conn.executescript(
            """
            CREATE TABLE IF NOT EXISTS watchlist (
                id INTEGER PRIMARY KEY AUTOINCREMENT,
                code TEXT NOT NULL UNIQUE,
                market TEXT NOT NULL,
                tencent_symbol TEXT NOT NULL,
                name TEXT,
                exchange TEXT,
                currency TEXT,
                last_price REAL,
                pe REAL,
                pb REAL,
                market_cap TEXT,
                week52_high REAL,
                week52_low REAL,
                quote_time TEXT,
                is_watched INTEGER NOT NULL DEFAULT 0,
                meta_json TEXT,
                created_at TEXT NOT NULL,
                updated_at TEXT NOT NULL
            );

            CREATE INDEX IF NOT EXISTS idx_watchlist_market ON watchlist (market, code);
            CREATE INDEX IF NOT EXISTS idx_watchlist_is_watched ON watchlist (is_watched, updated_at DESC);

            CREATE TABLE IF NOT EXISTS kline_daily (
                code TEXT NOT NULL,
                trade_date TEXT NOT NULL,
                open REAL NOT NULL,
                high REAL NOT NULL,
                low REAL NOT NULL,
                close REAL NOT NULL,
                volume REAL NOT NULL,
                adj_type TEXT NOT NULL DEFAULT 'qfq',
                source TEXT NOT NULL DEFAULT 'tencent',
                updated_at TEXT NOT NULL,
                PRIMARY KEY (code, trade_date, adj_type)
            );

            CREATE INDEX IF NOT EXISTS idx_kline_daily_code_date
                ON kline_daily (code, trade_date DESC);

            CREATE TABLE IF NOT EXISTS positions (
                watchlist_id INTEGER PRIMARY KEY,
                buy_price REAL NOT NULL,
                shares INTEGER NOT NULL,
                buy_date TEXT,
                note TEXT DEFAULT '',
                created_at TEXT NOT NULL,
                updated_at TEXT NOT NULL,
                FOREIGN KEY (watchlist_id) REFERENCES watchlist(id) ON DELETE CASCADE
            );

            CREATE TABLE IF NOT EXISTS analysis_cache (
                cache_key TEXT PRIMARY KEY,
                code TEXT NOT NULL,
                period TEXT NOT NULL,
                result_json TEXT NOT NULL,
                expires_at TEXT NOT NULL,
                created_at TEXT NOT NULL
            );

            CREATE INDEX IF NOT EXISTS idx_analysis_cache_expires_at
                ON analysis_cache (expires_at);

            CREATE INDEX IF NOT EXISTS idx_analysis_cache_code_period
                ON analysis_cache (code, period, created_at DESC);
            """
        )
        conn.commit()
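The `positions.watchlist_id` foreign key with `ON DELETE CASCADE` means deleting a watchlist row also deletes its position (provided `PRAGMA foreign_keys=ON`, which `get_connection` sets). A trimmed-down, in-memory demonstration of that relationship:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys=ON")  # cascades are off by default in sqlite
conn.executescript("""
    CREATE TABLE watchlist (id INTEGER PRIMARY KEY AUTOINCREMENT, code TEXT UNIQUE);
    CREATE TABLE positions (
        watchlist_id INTEGER PRIMARY KEY,
        shares INTEGER NOT NULL,
        FOREIGN KEY (watchlist_id) REFERENCES watchlist(id) ON DELETE CASCADE
    );
""")
conn.execute("INSERT INTO watchlist (code) VALUES ('0700.HK')")
wid = conn.execute("SELECT id FROM watchlist WHERE code='0700.HK'").fetchone()[0]
conn.execute("INSERT INTO positions VALUES (?, 100)", (wid,))

conn.execute("DELETE FROM watchlist WHERE id=?", (wid,))
remaining = conn.execute("SELECT COUNT(*) FROM positions").fetchone()[0]
print(remaining)  # the position row was cascaded away
```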


def cleanup_analysis_cache(conn: sqlite3.Connection | None = None) -> None:
    own_conn = conn is None
    conn = conn or get_connection()
    try:
        now = _utc_now_iso()
        conn.execute("DELETE FROM analysis_cache WHERE expires_at <= ?", (now,))
        overflow = conn.execute(
            "SELECT COUNT(*) AS cnt FROM analysis_cache"
        ).fetchone()["cnt"] - ANALYSIS_CACHE_MAX_ROWS
        if overflow > 0:
            conn.execute(
                """
                DELETE FROM analysis_cache
                WHERE cache_key IN (
                    SELECT cache_key
                    FROM analysis_cache
                    ORDER BY created_at ASC
                    LIMIT ?
                )
                """,
                (overflow,),
            )
        conn.commit()
    finally:
        if own_conn:
            conn.close()


def clear_analysis_cache() -> int:
    init_db()
    with get_connection() as conn:
        count = conn.execute("SELECT COUNT(*) AS cnt FROM analysis_cache").fetchone()["cnt"]
        conn.execute("DELETE FROM analysis_cache")
        conn.commit()
    return count


def get_cached_analysis(code: str, period: str) -> dict | None:
    init_db()
    with get_connection() as conn:
        cleanup_analysis_cache(conn)
        cache_key = f"{code}:{period}"
        row = conn.execute(
            """
            SELECT result_json
            FROM analysis_cache
            WHERE cache_key = ? AND expires_at > ?
            """,
            (cache_key, _utc_now_iso()),
        ).fetchone()
        if not row:
            return None
        result = json.loads(row["result_json"])
        result["_from_cache"] = True
        return result


def set_cached_analysis(code: str, period: str, result: dict) -> None:
    init_db()
    now = _utc_now_iso()
    expires_at = datetime.utcfromtimestamp(
        datetime.utcnow().timestamp() + ANALYSIS_CACHE_TTL_SECONDS
    ).replace(microsecond=0).isoformat()
    cache_key = f"{code}:{period}"
    with get_connection() as conn:
        cleanup_analysis_cache(conn)
        conn.execute(
            """
            INSERT INTO analysis_cache (cache_key, code, period, result_json, expires_at, created_at)
            VALUES (?, ?, ?, ?, ?, ?)
            ON CONFLICT(cache_key) DO UPDATE SET
                result_json = excluded.result_json,
                expires_at = excluded.expires_at,
                created_at = excluded.created_at
            """,
            (cache_key, code, period, json.dumps(result, ensure_ascii=False), expires_at, now),
        )
        conn.commit()
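The cache above stores JSON blobs keyed by `code:period` with an ISO-8601 expiry, and expiry checks are simple string comparisons (ISO timestamps sort lexicographically). A self-contained sketch of the same TTL pattern with a simplified table:

```python
import json
import sqlite3
from datetime import datetime, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cache (k TEXT PRIMARY KEY, v TEXT, expires_at TEXT)")

def put(key, value, ttl_seconds=600):
    expires = (datetime.utcnow() + timedelta(seconds=ttl_seconds)).isoformat()
    conn.execute(
        "INSERT INTO cache VALUES (?, ?, ?) "
        "ON CONFLICT(k) DO UPDATE SET v=excluded.v, expires_at=excluded.expires_at",
        (key, json.dumps(value), expires),
    )

def get(key):
    row = conn.execute(
        "SELECT v FROM cache WHERE k=? AND expires_at > ?",
        (key, datetime.utcnow().isoformat()),
    ).fetchone()
    return json.loads(row[0]) if row else None

put("0700.HK:6mo", {"action": "买入"})
fresh = get("0700.HK:6mo")
put("0700.HK:6mo", {"x": 1}, ttl_seconds=-1)  # write an already-expired entry
stale = get("0700.HK:6mo")
print(fresh, stale)
```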


def upsert_watchlist_item(
    *,
    code: str,
    market: str,
    tencent_symbol: str,
    name: str | None = None,
    exchange: str | None = None,
    currency: str | None = None,
    last_price: float | None = None,
    pe: float | None = None,
    pb: float | None = None,
    market_cap: str | None = None,
    week52_high: float | None = None,
    week52_low: float | None = None,
    quote_time: str | None = None,
    is_watched: bool | None = None,
    meta: dict | None = None,
) -> dict:
    init_db()
    now = _utc_now_iso()
    with get_connection() as conn:
        existing = conn.execute("SELECT * FROM watchlist WHERE code = ?", (code,)).fetchone()
        created_at = existing["created_at"] if existing else now
        current_is_watched = existing["is_watched"] if existing else 0
        conn.execute(
            """
            INSERT INTO watchlist (
                code, market, tencent_symbol, name, exchange, currency, last_price,
                pe, pb, market_cap, week52_high, week52_low, quote_time, is_watched,
                meta_json, created_at, updated_at
            ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
            ON CONFLICT(code) DO UPDATE SET
                market = excluded.market,
                tencent_symbol = excluded.tencent_symbol,
                name = COALESCE(excluded.name, watchlist.name),
                exchange = COALESCE(excluded.exchange, watchlist.exchange),
                currency = COALESCE(excluded.currency, watchlist.currency),
                last_price = COALESCE(excluded.last_price, watchlist.last_price),
                pe = COALESCE(excluded.pe, watchlist.pe),
                pb = COALESCE(excluded.pb, watchlist.pb),
                market_cap = COALESCE(excluded.market_cap, watchlist.market_cap),
                week52_high = COALESCE(excluded.week52_high, watchlist.week52_high),
                week52_low = COALESCE(excluded.week52_low, watchlist.week52_low),
                quote_time = COALESCE(excluded.quote_time, watchlist.quote_time),
                is_watched = excluded.is_watched,
                meta_json = COALESCE(excluded.meta_json, watchlist.meta_json),
                updated_at = excluded.updated_at
            """,
            (
                code,
                market,
                tencent_symbol,
                name,
                exchange,
                currency,
                last_price,
                pe,
                pb,
                market_cap,
                week52_high,
                week52_low,
                quote_time,
                int(current_is_watched if is_watched is None else is_watched),
                json.dumps(meta, ensure_ascii=False) if meta else None,
                created_at,
                now,
            ),
        )
        conn.commit()
        row = conn.execute("SELECT * FROM watchlist WHERE code = ?", (code,)).fetchone()
        return dict(row)
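The `COALESCE(excluded.col, watchlist.col)` pattern in the upsert keeps the stored value whenever the incoming write passes NULL, so a partial quote refresh never wipes fields it didn't fetch. A trimmed, in-memory demonstration of that behavior:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE w (code TEXT PRIMARY KEY, name TEXT, last_price REAL)")
upsert = (
    "INSERT INTO w VALUES (?, ?, ?) ON CONFLICT(code) DO UPDATE SET "
    "name = COALESCE(excluded.name, w.name), "
    "last_price = COALESCE(excluded.last_price, w.last_price)"
)
conn.execute(upsert, ("0700.HK", "腾讯控股", 321.5))
conn.execute(upsert, ("0700.HK", None, 330.0))  # price refresh without a name
name, price = conn.execute("SELECT name, last_price FROM w").fetchone()
print(name, price)  # the earlier name survives, the price is updated
```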


def get_watchlist_item(code: str) -> dict | None:
    init_db()
    with get_connection() as conn:
        row = conn.execute("SELECT * FROM watchlist WHERE code = ?", (code,)).fetchone()
        return dict(row) if row else None


def list_watchlist(only_watched: bool = False) -> list[dict]:
    init_db()
    sql = "SELECT * FROM watchlist"
    params = ()
    if only_watched:
        sql += " WHERE is_watched = 1"
    sql += " ORDER BY updated_at DESC, code ASC"
    with get_connection() as conn:
        rows = conn.execute(sql, params).fetchall()
    return [dict(row) for row in rows]


def set_watch_status(code: str, watched: bool) -> dict | None:
    init_db()
    with get_connection() as conn:
        row = conn.execute("SELECT * FROM watchlist WHERE code = ?", (code,)).fetchone()
        if not row:
            return None
        conn.execute(
            "UPDATE watchlist SET is_watched = ?, updated_at = ? WHERE code = ?",
            (int(watched), _utc_now_iso(), code),
        )
        conn.commit()
        row = conn.execute("SELECT * FROM watchlist WHERE code = ?", (code,)).fetchone()
        return dict(row) if row else None


def get_latest_kline_date(code: str, adj_type: str = "qfq") -> str | None:
    init_db()
    with get_connection() as conn:
        row = conn.execute(
            "SELECT MAX(trade_date) AS latest_date FROM kline_daily WHERE code = ? AND adj_type = ?",
            (code, adj_type),
        ).fetchone()
        return row["latest_date"] if row and row["latest_date"] else None


def upsert_kline_df(code: str, df, adj_type: str = "qfq", source: str = "tencent") -> int:
    import pandas as pd

    if df.empty:
        return 0
    init_db()
    now = _utc_now_iso()
    records = []
    for idx, row in df.sort_index().iterrows():
        trade_date = pd.Timestamp(idx).strftime("%Y-%m-%d")
        records.append(
            (
                code,
                trade_date,
                float(row["Open"]),
                float(row["High"]),
                float(row["Low"]),
                float(row["Close"]),
                float(row["Volume"]),
                adj_type,
                source,
                now,
            )
        )
    with get_connection() as conn:
        conn.executemany(
            """
            INSERT INTO kline_daily (
                code, trade_date, open, high, low, close, volume, adj_type, source, updated_at
            ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
            ON CONFLICT(code, trade_date, adj_type) DO UPDATE SET
                open = excluded.open,
                high = excluded.high,
                low = excluded.low,
                close = excluded.close,
                volume = excluded.volume,
                source = excluded.source,
                updated_at = excluded.updated_at
            """,
            records,
        )
        conn.commit()
    return len(records)


def get_kline_df(code: str, limit: int, adj_type: str = "qfq"):
    import pandas as pd

    init_db()
    with get_connection() as conn:
        rows = conn.execute(
            """
            SELECT trade_date, open, high, low, close, volume
            FROM kline_daily
            WHERE code = ? AND adj_type = ?
            ORDER BY trade_date DESC
            LIMIT ?
            """,
            (code, adj_type, limit),
        ).fetchall()
    if not rows:
        return pd.DataFrame()
    records = [
        {
            "Date": row["trade_date"],
            "Open": row["open"],
            "High": row["high"],
            "Low": row["low"],
            "Close": row["close"],
            "Volume": row["volume"],
        }
        for row in reversed(rows)
    ]
    df = pd.DataFrame(records)
    df["Date"] = pd.to_datetime(df["Date"])
    df.set_index("Date", inplace=True)
    return df
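`get_kline_df` selects the *newest* N rows (`ORDER BY trade_date DESC LIMIT ?`), so it must reverse them back into chronological order before indicator math, and it promotes `trade_date` to a `DatetimeIndex`. A small sketch of that rebuild with made-up rows:

```python
import pandas as pd

# Rows as they come back from "ORDER BY trade_date DESC LIMIT 3" (newest first).
rows = [("2024-01-05", 10.2), ("2024-01-04", 10.1), ("2024-01-03", 10.0)]

records = [{"Date": d, "Close": c} for d, c in reversed(rows)]  # oldest first
df = pd.DataFrame(records)
df["Date"] = pd.to_datetime(df["Date"])
df = df.set_index("Date")
print(df.index.is_monotonic_increasing)  # chronological order restored
```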


def list_positions() -> list[dict]:
    init_db()
    with get_connection() as conn:
        rows = conn.execute(
            """
            SELECT
                p.watchlist_id,
                w.code,
                w.market,
                w.name,
                w.currency,
                p.buy_price,
                p.shares,
                p.buy_date,
                p.note,
                p.created_at AS added_at,
                p.updated_at
            FROM positions p
            JOIN watchlist w ON w.id = p.watchlist_id
            ORDER BY w.code ASC
            """
        ).fetchall()
    return [dict(row) for row in rows]


def get_position(code: str) -> dict | None:
    init_db()
    with get_connection() as conn:
        row = conn.execute(
            """
            SELECT
                p.watchlist_id,
                w.code,
                w.market,
                w.name,
                w.currency,
                p.buy_price,
                p.shares,
                p.buy_date,
                p.note,
                p.created_at AS added_at,
                p.updated_at
            FROM positions p
            JOIN watchlist w ON w.id = p.watchlist_id
            WHERE w.code = ?
            """,
            (code,),
        ).fetchone()
        return dict(row) if row else None


def upsert_position(
    *,
    code: str,
    market: str,
    tencent_symbol: str,
    buy_price: float,
    shares: int,
    buy_date: str | None,
    note: str = "",
    name: str | None = None,
    currency: str | None = None,
    meta: dict | None = None,
) -> dict:
    init_db()
    watch = upsert_watchlist_item(
        code=code,
        market=market,
        tencent_symbol=tencent_symbol,
        name=name,
        currency=currency,
        is_watched=True,
        meta=meta,
    )
    now = _utc_now_iso()
    with get_connection() as conn:
        existing = conn.execute(
            "SELECT created_at FROM positions WHERE watchlist_id = ?", (watch["id"],)
        ).fetchone()
        created_at = existing["created_at"] if existing else now
        conn.execute(
            """
            INSERT INTO positions (watchlist_id, buy_price, shares, buy_date, note, created_at, updated_at)
            VALUES (?, ?, ?, ?, ?, ?, ?)
            ON CONFLICT(watchlist_id) DO UPDATE SET
                buy_price = excluded.buy_price,
                shares = excluded.shares,
                buy_date = excluded.buy_date,
                note = excluded.note,
                updated_at = excluded.updated_at
            """,
            (watch["id"], buy_price, shares, buy_date, note, created_at, now),
        )
        conn.commit()
    return get_position(code)


def remove_position(code: str) -> bool:
    init_db()
    with get_connection() as conn:
        row = conn.execute("SELECT id FROM watchlist WHERE code = ?", (code,)).fetchone()
        if not row:
            return False
        cur = conn.execute("DELETE FROM positions WHERE watchlist_id = ?", (row["id"],))
        conn.commit()
        return cur.rowcount > 0


def update_position_fields(code: str, price: float | None = None, shares: int | None = None, note: str | None = None) -> dict | None:
    current = get_position(code)
    if not current:
        return None
    watch = get_watchlist_item(code)
    return upsert_position(
        code=code,
        market=watch["market"],
        tencent_symbol=watch["tencent_symbol"],
        buy_price=price if price is not None else current["buy_price"],
        shares=shares if shares is not None else current["shares"],
        buy_date=current.get("buy_date"),
        note=note if note is not None else current.get("note", ""),
        name=watch.get("name"),
        currency=watch.get("currency"),
        meta=json.loads(watch["meta_json"]) if watch.get("meta_json") else None,
    )
@@ -1,6 +1,6 @@
#!/usr/bin/env python3
"""
HK stock portfolio manager - maintain the position list and run batch analysis.
Multi-market position/watchlist manager - SQLite-backed management of watched stocks and positions, with batch analysis.

Usage:
    python3 portfolio_manager.py list
@@ -8,8 +8,11 @@
    python3 portfolio_manager.py remove <code>
    python3 portfolio_manager.py update <code> [--price <price>] [--shares <shares>] [--note <note>]
    python3 portfolio_manager.py analyze [--output <file>]
    python3 portfolio_manager.py watch-list
    python3 portfolio_manager.py watch-add <code>
    python3 portfolio_manager.py watch-remove <code>

Portfolio file defaults to: ~/.stockbuddy/portfolio.json
Data is stored by default in: ~/.stockbuddy/stockbuddy.db
"""

import sys
@@ -18,136 +21,166 @@ import argparse
import os
import time
from datetime import datetime
from pathlib import Path

DATA_DIR = Path.home() / ".stockbuddy"
PORTFOLIO_PATH = DATA_DIR / "portfolio.json"
LEGACY_PORTFOLIO_PATH = Path.home() / ".hk_stock_portfolio.json"


def load_portfolio() -> dict:
    """Load portfolio data."""
    if not PORTFOLIO_PATH.exists() and LEGACY_PORTFOLIO_PATH.exists():
        DATA_DIR.mkdir(parents=True, exist_ok=True)
        PORTFOLIO_PATH.write_text(LEGACY_PORTFOLIO_PATH.read_text(encoding="utf-8"), encoding="utf-8")

    if not PORTFOLIO_PATH.exists():
        return {"positions": [], "updated_at": None}
    with open(PORTFOLIO_PATH, "r", encoding="utf-8") as f:
        return json.load(f)


def save_portfolio(data: dict):
    """Save portfolio data."""
    data["updated_at"] = datetime.now().isoformat()
    DATA_DIR.mkdir(parents=True, exist_ok=True)
    with open(PORTFOLIO_PATH, "w", encoding="utf-8") as f:
        json.dump(data, f, ensure_ascii=False, indent=2)
try:
    from db import (
        DB_PATH,
        get_watchlist_item,
        init_db,
        list_positions as db_list_positions,
        list_watchlist as db_list_watchlist,
        remove_position as db_remove_position,
        set_watch_status,
        update_position_fields,
        upsert_position,
        upsert_watchlist_item,
    )
    from analyze_stock import fetch_tencent_quote, normalize_stock_code, analyze_stock
except ImportError:
    script_dir = os.path.dirname(os.path.abspath(__file__))
    sys.path.insert(0, script_dir)
    from db import (
        DB_PATH,
        get_watchlist_item,
        init_db,
        list_positions as db_list_positions,
        list_watchlist as db_list_watchlist,
        remove_position as db_remove_position,
        set_watch_status,
        update_position_fields,
        upsert_position,
        upsert_watchlist_item,
    )
    from analyze_stock import fetch_tencent_quote, normalize_stock_code, analyze_stock


def normalize_code(code: str) -> str:
    """Normalize an HK stock code."""
    code = code.strip().upper()
    if not code.endswith(".HK"):
        digits = code.lstrip("0")
        if digits.isdigit():
            code = code.zfill(4) + ".HK"
    return code
    return normalize_stock_code(code)["code"]


def ensure_watch_item(code: str, watched: bool = False) -> dict:
    stock = normalize_stock_code(code)
    quote = fetch_tencent_quote(stock["code"])
    name = quote.get("name") if quote else None
    return upsert_watchlist_item(
        code=stock["code"],
        market=stock["market"],
        tencent_symbol=stock["tencent_symbol"],
        name=name,
        exchange=quote.get("exchange", stock.get("exchange")) if quote else stock.get("exchange"),
        currency=quote.get("currency") if quote else None,
        last_price=quote.get("price") if quote else None,
        pe=quote.get("pe") if quote else None,
        pb=quote.get("pb") if quote else None,
        market_cap=quote.get("market_cap") if quote else None,
        week52_high=quote.get("52w_high") if quote else None,
        week52_low=quote.get("52w_low") if quote else None,
        quote_time=quote.get("timestamp") if quote else None,
        is_watched=watched,
        meta=quote or stock,
    )


# ─────────────────────────────────────────────
# Position management
# ─────────────────────────────────────────────

def list_positions():
    """List all positions."""
    portfolio = load_portfolio()
    positions = portfolio.get("positions", [])
    init_db()
    positions = db_list_positions()
    if not positions:
        print(json.dumps({"message": "持仓为空", "positions": []}, ensure_ascii=False, indent=2))
        return
    print(json.dumps({
        "total_positions": len(positions),
        "positions": positions,
        "portfolio_file": str(PORTFOLIO_PATH),
        "updated_at": portfolio.get("updated_at"),
        "portfolio_db": str(DB_PATH),
        "updated_at": datetime.now().isoformat(),
    }, ensure_ascii=False, indent=2))
|
||||
|
||||
|
||||
def add_position(code: str, price: float, shares: int, date: str = None, note: str = ""):
    """添加持仓"""
-    code = normalize_code(code)
-    portfolio = load_portfolio()
-    positions = portfolio.get("positions", [])
-
-    # 检查是否已存在
-    for pos in positions:
-        if pos["code"] == code:
-            print(json.dumps({"error": f"{code} 已在持仓中,请使用 update 命令更新"}, ensure_ascii=False))
-            return
-
-    position = {
-        "code": code,
-        "buy_price": price,
-        "shares": shares,
-        "buy_date": date or datetime.now().strftime("%Y-%m-%d"),
-        "note": note,
-        "added_at": datetime.now().isoformat(),
-    }
-    positions.append(position)
-    portfolio["positions"] = positions
-    save_portfolio(portfolio)
-    print(json.dumps({"message": f"已添加 {code}", "position": position}, ensure_ascii=False, indent=2))
+    init_db()
+    normalized = normalize_stock_code(code)
+    existing = next((p for p in db_list_positions() if p["code"] == normalized["code"]), None)
+    if existing:
+        print(json.dumps({"error": f"{normalized['code']} 已在持仓中,请使用 update 命令更新"}, ensure_ascii=False))
+        return
+
+    watch = ensure_watch_item(normalized["code"], watched=True)
+    position = upsert_position(
+        code=normalized["code"],
+        market=normalized["market"],
+        tencent_symbol=normalized["tencent_symbol"],
+        buy_price=price,
+        shares=shares,
+        buy_date=date or datetime.now().strftime("%Y-%m-%d"),
+        note=note,
+        name=watch.get("name"),
+        currency=watch.get("currency"),
+        meta=json.loads(watch["meta_json"]) if watch.get("meta_json") else None,
+    )
+    print(json.dumps({"message": f"已添加 {normalized['code']}", "position": position}, ensure_ascii=False, indent=2))


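`upsert_position` lives in the new `portfolio_db` module, which is not shown in this diff. A self-contained sketch of the kind of SQLite upsert it is assumed to perform — the table name, columns, and return shape here are illustrative:

```python
import sqlite3

def upsert_position_sketch(conn: sqlite3.Connection, code: str,
                           buy_price: float, shares: int) -> dict:
    """Insert a position row, or update it if the code already exists."""
    conn.execute("""
        CREATE TABLE IF NOT EXISTS positions (
            code TEXT PRIMARY KEY,
            buy_price REAL,
            shares INTEGER
        )
    """)
    # SQLite upsert: ON CONFLICT on the primary key updates in place
    conn.execute("""
        INSERT INTO positions (code, buy_price, shares)
        VALUES (?, ?, ?)
        ON CONFLICT(code) DO UPDATE SET
            buy_price = excluded.buy_price,
            shares = excluded.shares
    """, (code, buy_price, shares))
    conn.commit()
    row = conn.execute(
        "SELECT code, buy_price, shares FROM positions WHERE code = ?", (code,)
    ).fetchone()
    return {"code": row[0], "buy_price": row[1], "shares": row[2]}
```

Calling it twice for the same code updates rather than duplicates the row, which is why `add_position` can guard with its own "already exists" check and still delegate storage to a single upsert. `ON CONFLICT ... DO UPDATE` requires SQLite ≥ 3.24.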
def remove_position(code: str):
    """移除持仓"""
-    code = normalize_code(code)
-    portfolio = load_portfolio()
-    positions = portfolio.get("positions", [])
-    new_positions = [p for p in positions if p["code"] != code]
-    if len(new_positions) == len(positions):
-        print(json.dumps({"error": f"{code} 不在持仓中"}, ensure_ascii=False))
-        return
-    portfolio["positions"] = new_positions
-    save_portfolio(portfolio)
-    print(json.dumps({"message": f"已移除 {code}"}, ensure_ascii=False, indent=2))
+    init_db()
+    normalized_code = normalize_code(code)
+    removed = db_remove_position(normalized_code)
+    if not removed:
+        print(json.dumps({"error": f"{normalized_code} 不在持仓中"}, ensure_ascii=False))
+        return
+    print(json.dumps({"message": f"已移除 {normalized_code}"}, ensure_ascii=False, indent=2))


def update_position(code: str, price: float = None, shares: int = None, note: str = None):
    """更新持仓信息"""
-    code = normalize_code(code)
-    portfolio = load_portfolio()
-    positions = portfolio.get("positions", [])
-    found = False
-    for pos in positions:
-        if pos["code"] == code:
-            if price is not None:
-                pos["buy_price"] = price
-            if shares is not None:
-                pos["shares"] = shares
-            if note is not None:
-                pos["note"] = note
-            pos["updated_at"] = datetime.now().isoformat()
-            found = True
-            print(json.dumps({"message": f"已更新 {code}", "position": pos}, ensure_ascii=False, indent=2))
-            break
-    if not found:
-        print(json.dumps({"error": f"{code} 不在持仓中"}, ensure_ascii=False))
-    portfolio["positions"] = positions
-    save_portfolio(portfolio)
+    init_db()
+    normalized_code = normalize_code(code)
+    position = update_position_fields(normalized_code, price=price, shares=shares, note=note)
+    if not position:
+        print(json.dumps({"error": f"{normalized_code} 不在持仓中"}, ensure_ascii=False))
+        return
+    print(json.dumps({"message": f"已更新 {normalized_code}", "position": position}, ensure_ascii=False, indent=2))


+# ─────────────────────────────────────────────
+# 关注池管理
+# ─────────────────────────────────────────────
+
+def list_watchlist():
+    init_db()
+    items = db_list_watchlist(only_watched=True)
+    print(json.dumps({
+        "total_watchlist": len(items),
+        "watchlist": items,
+        "portfolio_db": str(DB_PATH),
+        "updated_at": datetime.now().isoformat(),
+    }, ensure_ascii=False, indent=2))
+
+
+def add_watch(code: str):
+    init_db()
+    watch = ensure_watch_item(code, watched=True)
+    print(json.dumps({"message": f"已关注 {watch['code']}", "watch": watch}, ensure_ascii=False, indent=2))
+
+
+def remove_watch(code: str):
+    init_db()
+    normalized_code = normalize_code(code)
+    watch = set_watch_status(normalized_code, False)
+    if not watch:
+        print(json.dumps({"error": f"{normalized_code} 不在关注池中"}, ensure_ascii=False))
+        return
+    print(json.dumps({"message": f"已取消关注 {normalized_code}", "watch": watch}, ensure_ascii=False, indent=2))


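Note that `remove_watch` soft-deletes by flipping `is_watched` rather than deleting the row, so the stock's cached quote metadata survives un-watching. `set_watch_status` is defined in `portfolio_db` (not shown); a sketch of the assumed behavior against an illustrative schema:

```python
import sqlite3

def set_watch_status_sketch(conn: sqlite3.Connection, code: str, watched: bool):
    """Flip the is_watched flag; return the row as a dict, or None if absent."""
    cur = conn.execute(
        "UPDATE watchlist SET is_watched = ? WHERE code = ?",
        (1 if watched else 0, code),
    )
    conn.commit()
    if cur.rowcount == 0:  # no matching row: code was never in the pool
        return None
    row = conn.execute(
        "SELECT code, is_watched FROM watchlist WHERE code = ?", (code,)
    ).fetchone()
    return {"code": row[0], "is_watched": bool(row[1])}
```

Returning `None` for a missing code is what lets `remove_watch` above print its "不在关注池中" error instead of silently succeeding.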
# ─────────────────────────────────────────────
# 批量分析
# ─────────────────────────────────────────────

def analyze_portfolio(output_file: str = None):
    """批量分析所有持仓"""
    # 延迟导入,避免分析依赖未安装时也能管理持仓
    try:
        from analyze_stock import analyze_stock
    except ImportError:
        # 尝试从同目录导入
        script_dir = os.path.dirname(os.path.abspath(__file__))
        sys.path.insert(0, script_dir)
        from analyze_stock import analyze_stock

-    portfolio = load_portfolio()
-    positions = portfolio.get("positions", [])
+    init_db()
+    positions = db_list_positions()
    if not positions:
        print(json.dumps({"message": "持仓为空,无法分析"}, ensure_ascii=False, indent=2))
        return
@@ -158,7 +191,6 @@ def analyze_portfolio(output_file: str = None):
        print(f"正在分析 {code} ({i+1}/{len(positions)})...", file=sys.stderr)
        analysis = analyze_stock(code)

        # 计算盈亏
        if analysis.get("current_price") and pos.get("buy_price"):
            current = analysis["current_price"]
            buy = pos["buy_price"]
@@ -175,15 +207,15 @@ def analyze_portfolio(output_file: str = None):
            "pnl": round(pnl, 2),
            "pnl_pct": round(pnl_pct, 2),
            "note": pos.get("note", ""),
+            "currency": pos.get("currency"),
+            "market": pos.get("market"),
        }

        results.append(analysis)

        # 批量请求间隔:避免连续请求触发限频(最后一只不需要等待)
        if i < len(positions) - 1 and not analysis.get("_from_cache"):
            time.sleep(2)

    # 汇总
    total_cost = sum(r.get("portfolio_info", {}).get("cost", 0) for r in results)
    total_value = sum(r.get("portfolio_info", {}).get("market_value", 0) for r in results)
    total_pnl = total_value - total_cost
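The per-position PnL and the portfolio totals above reduce to plain cost/value arithmetic; a condensed, self-contained sketch (field names assumed from the surrounding code):

```python
def summarize_portfolio(positions):
    """Per-position PnL plus total portfolio PnL (illustrative field names)."""
    results = []
    for pos in positions:
        cost = pos["buy_price"] * pos["shares"]
        value = pos["current_price"] * pos["shares"]
        pnl = value - cost
        results.append({
            "code": pos["code"],
            "pnl": round(pnl, 2),
            "pnl_pct": round(pnl / cost * 100, 2),
        })
    total_cost = sum(p["buy_price"] * p["shares"] for p in positions)
    total_value = sum(p["current_price"] * p["shares"] for p in positions)
    return results, round(total_value - total_cost, 2)
```

One caveat the real code shares with this sketch: summing raw cost and market value across markets mixes currencies (HKD, CNY, USD) unless the caller converts first — the `currency` field added to each result above is what makes that conversion possible downstream.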
@@ -209,13 +241,11 @@ def analyze_portfolio(output_file: str = None):


def main():
-    parser = argparse.ArgumentParser(description="港股持仓管理工具")
+    parser = argparse.ArgumentParser(description="多市场股票持仓/关注池管理工具")
    subparsers = parser.add_subparsers(dest="command", help="子命令")

    # list
    subparsers.add_parser("list", help="列出所有持仓")

    # add
    add_parser = subparsers.add_parser("add", help="添加持仓")
    add_parser.add_argument("code", help="股票代码")
    add_parser.add_argument("--price", type=float, required=True, help="买入价格")
@@ -223,21 +253,24 @@ def main():
    add_parser.add_argument("--date", help="买入日期 (YYYY-MM-DD)")
    add_parser.add_argument("--note", default="", help="备注")

    # remove
    rm_parser = subparsers.add_parser("remove", help="移除持仓")
    rm_parser.add_argument("code", help="股票代码")

    # update
    up_parser = subparsers.add_parser("update", help="更新持仓")
    up_parser.add_argument("code", help="股票代码")
    up_parser.add_argument("--price", type=float, help="买入价格")
    up_parser.add_argument("--shares", type=int, help="持有数量")
    up_parser.add_argument("--note", help="备注")

    # analyze
    analyze_parser = subparsers.add_parser("analyze", help="批量分析持仓")
    analyze_parser.add_argument("--output", help="输出JSON文件")

+    watch_list_parser = subparsers.add_parser("watch-list", help="列出关注池")
+    watch_add_parser = subparsers.add_parser("watch-add", help="添加关注股票")
+    watch_add_parser.add_argument("code", help="股票代码")
+    watch_remove_parser = subparsers.add_parser("watch-remove", help="取消关注股票")
+    watch_remove_parser.add_argument("code", help="股票代码")

    args = parser.parse_args()

    if args.command == "list":
@@ -250,6 +283,12 @@ def main():
        update_position(args.code, args.price, args.shares, args.note)
    elif args.command == "analyze":
        analyze_portfolio(args.output)
+    elif args.command == "watch-list":
+        list_watchlist()
+    elif args.command == "watch-add":
+        add_watch(args.code)
+    elif args.command == "watch-remove":
+        remove_watch(args.code)
    else:
        parser.print_help()
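The CLI above is the standard argparse subcommand pattern: one sub-parser per command, dispatch on `args.command`. A trimmed, runnable sketch with the same command names (handlers and help strings omitted; the script filename is not shown in this diff):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Mirrors the subcommand layout in the diff, reduced to three commands.
    parser = argparse.ArgumentParser(description="portfolio/watchlist manager (sketch)")
    sub = parser.add_subparsers(dest="command")
    add = sub.add_parser("add")
    add.add_argument("code")
    add.add_argument("--price", type=float, required=True)
    sub.add_parser("watch-list")
    watch_add = sub.add_parser("watch-add")
    watch_add.add_argument("code")
    return parser

# Parsing a sample command line instead of sys.argv:
args = build_parser().parse_args(["add", "0700.HK", "--price", "300"])
```

In the real script this would be driven by `sys.argv`, e.g. `python <script>.py add 0700.HK --price 300 --shares 100`.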