Compare commits: 8ebaaf8b36...main (16 commits)

SHA1
cdce16c29e
8238ca5352
35577770dd
1c974595ef
6c444962ac
ba9d81f453
b9767e4cfc
61c9aba952
4012ff7bea
537a62988a
9628883fa7
673bc9cd6f
2aaab6673f
c392fbc366
1aaa2d70a0
7fec7ec740
Dockerfile
@@ -1,4 +1,4 @@
-FROM node:24-bookworm-slim AS builder
+FROM node:24.5.0-bookworm-slim AS builder
 
 WORKDIR /usr/src/build
 
@@ -12,7 +12,7 @@ COPY static/css/input.css ./static/css/input.css
 
 RUN npx tailwindcss -i ./static/css/input.css -o ./static/css/style.css --minify
 
-FROM python:3.13.5-slim
+FROM python:3.13.6-slim
 
 EXPOSE 5000
 
516
README.md
516
README.md
@@ -1,336 +1,308 @@
-# rstat - Reddit Stock Analyzer
+<div align="center">

-A powerful, installable command-line tool and web dashboard to scan Reddit for stock ticker mentions, perform sentiment analysis, generate insightful reports, and create shareable summary images.
+# RSTAT — Reddit Stock Analyzer

-## Key Features
+Scan Reddit for stock ticker mentions, score sentiment, enrich with price/market cap, and explore the results in a clean web dashboard. Automate shareable images and post them to Reddit.

-* **Dual-Interface:** Use a flexible command-line tool (`rstat`) for data collection and a simple web dashboard (`rstat-dashboard`) for data visualization.
-* **Flexible Data Scraping:**
-  * Scan subreddits from a config file or target a single subreddit on the fly.
-  * Configure the time window to scan posts from the last 24 hours (for daily cron jobs) or back-fill data from several past days (e.g., last 7 days).
-  * Fetches from `/new` to capture the most recent discussions.
-* **Deep Analysis & Storage:**
-  * Scans both post titles and comments, differentiating between the two.
-  * Performs a "deep dive" analysis on posts to calculate the average sentiment of the entire comment section.
-  * Persists all data in a local SQLite database (`reddit_stocks.db`) to track trends over time.
-* **Rich Data Enrichment:**
-  * Calculates sentiment (Bullish, Bearish, Neutral) for every mention using NLTK.
-  * Fetches and stores daily closing prices and market capitalization from Yahoo Finance.
-* **Interactive Web Dashboard:**
-  * View Top 10 tickers across all subreddits or on a per-subreddit basis.
-  * Click any ticker to get a "Deep Dive" page, showing every post it was mentioned in.
-* **Shareable Summary Images:**
-  * Generate clean, dark-mode summary images for both daily and weekly sentiment for any subreddit, perfect for sharing.
-* **High-Quality Data:**
-  * Uses a configurable blacklist and smart filtering to reduce false positives.
-  * Automatically cleans the database of invalid tickers if the blacklist is updated.
+</div>

-## Project Structure
+## Highlights

+- CLI + Web UI: Collect data with `rstat`, browse it with `rstat-dashboard`.
+- Smart ticker parsing: Prefer $TSLA/$AAPL “golden” matches; fall back to filtered ALL-CAPS words.
+- Sentiment: VADER (NLTK) scores for titles and comments; “deep dive” averages per post.
+- Storage: Local SQLite database `reddit_stocks.db` with de-duped mentions and post analytics.
+- Enrichment: Yahoo Finance market cap + latest close fetched in batch and on-demand.
+- Images: Export polished daily/weekly summary PNGs for subreddits or “overall”.
+- Automation: Optional cron job plus one-command posting to Reddit with OAuth refresh tokens.

+## Repository layout

 ```
-reddit_stock_analyzer/
-├── .env # Your secret API keys
-├── requirements.txt # Project dependencies
-├── setup.py # Installation script for the tool
-├── subreddits.json # Default list of subreddits to scan
-├── templates/ # HTML templates for the web dashboard
-│   ├── base.html
-│   ├── index.html
-│   ├── subreddit.html
-│   ├── deep_dive.html
-│   ├── image_view.html
-│   └── weekly_image_view.html
-└── rstat_tool/ # The main source code package
-    ├── __init__.py
-    ├── main.py # Scraper entry point and CLI logic
-    ├── dashboard.py # Web dashboard entry point (Flask app)
-    ├── database.py # All SQLite database functions
-    └── ...
+.
+├── Dockerfile # Multi-stage build (Tailwind -> Python + gunicorn)
+├── docker-compose.yml # Prod (nginx + varnish optional) + dashboard
+├── docker-compose-dev.yml # Dev compose (local nginx)
+├── requirements.txt # Python deps
+├── setup.py # Installs console scripts
+├── subreddits.json # Default subreddits list
+├── reddit_stocks.db # SQLite database (generated/updated by CLI)
+├── export_image.py # Generate shareable PNGs (Playwright)
+├── post_to_reddit.py # Post latest PNG to Reddit
+├── get_refresh_token.py # One-time OAuth2 refresh token helper
+├── fetch_close_price.py # Utility for closing price (yfinance)
+├── fetch_market_cap.py # Utility for market cap (yfinance)
+├── rstat_tool/
+│   ├── main.py # CLI entry (rstat)
+│   ├── dashboard.py # Flask app entry (rstat-dashboard)
+│   ├── database.py # SQLite schema + queries
+│   ├── ticker_extractor.py # Ticker parsing + blacklist
+│   ├── sentiment_analyzer.py # VADER sentiment
+│   ├── cleanup.py # Cleanup utilities (rstat-cleanup)
+│   ├── flair_finder.py # Fetch subreddit flair IDs (rstat-flairs)
+│   ├── logger_setup.py # Logging
+│   └── setup_nltk.py # One-time VADER download
+├── templates/ # Jinja2 templates (Tailwind 4 styling)
+└── static/ # Favicon + generated CSS (style.css)
 ```

-## Setup and Installation
+## Requirements

-Follow these steps to set up the project on your local machine.
+- Python 3.10+ (Docker image uses Python 3.13-slim)
+- Reddit API app (script type) for read + submit
+- For optional image export: Playwright browsers
+- For UI development (optional): Node 18+ to rebuild Tailwind CSS

-### 1. Prerequisites
-* Python 3.7+
-* Git
+## Setup
+1) Clone and enter the repo

-### 2. Clone the Repository
 ```bash
-git clone <your-repository-url>
+git clone <your-repo>
 cd reddit_stock_analyzer
 ```

-### 3. Set Up a Python Virtual Environment
-It is highly recommended to use a virtual environment to manage dependencies.
+2) Create and activate a virtualenv

-**On macOS / Linux:**
+- bash/zsh:
 ```bash
 python3 -m venv .venv
 source .venv/bin/activate
 ```
+- fish:
+```fish
+python3 -m venv .venv
+source .venv/bin/activate.fish
+```

-**On Windows:**
-```bash
-python -m venv .venv
-.\.venv\Scripts\activate
-```
+3) Install Python dependencies and commands

-### 4. Install Dependencies
 ```bash
 pip install -r requirements.txt
+pip install -e .
 ```

-### 5. Configure Reddit API Credentials
-1. Go to the [Reddit Apps preferences page](https://www.reddit.com/prefs/apps) and create a new "script" app.
-2. Create a file named `.env` in the root of the project directory.
-3. Add your credentials to the `.env` file like this:
+4) Configure environment

-```
-REDDIT_CLIENT_ID=your_client_id_from_reddit
-REDDIT_CLIENT_SECRET=your_client_secret_from_reddit
-REDDIT_USER_AGENT=A custom user agent string (e.g., python:rstat:v1.2)
-```
+Create a `.env` file in the repo root with your Reddit app credentials:
+
+```
+REDDIT_CLIENT_ID=your_client_id
+REDDIT_CLIENT_SECRET=your_client_secret
+REDDIT_USER_AGENT=python:rstat:v1.0 (by u/yourname)
+```

+Optional (after OAuth step below):

+```
+REDDIT_REFRESH_TOKEN=your_refresh_token
+```

+5) One-time NLTK setup

-### 6. Set Up NLTK
-Run the included setup script **once** to download the required `vader_lexicon` for sentiment analysis.
 ```bash
 python rstat_tool/setup_nltk.py
 ```

-### 7. Set Up Playwright
-Run the install routine for playwright. You might need to install some dependencies. Follow on-screen instruction if that's the case.
-```bash
-playwright install
-```
+6) Configure subreddits (optional)

-### 8. Build and Install the Commands
-Install the tool in "editable" mode. This creates the `rstat` and `rstat-dashboard` commands in your virtual environment and links them to your source code.
+Edit `subreddits.json` to your liking. It ships with a sane default list.
+## CLI usage (rstat)

+The `rstat` command collects Reddit data and updates the database. Credentials are read from `.env`.

+Common flags (see `rstat --help`):

+- `--config FILE` Use a JSON file with `{"subreddits": [ ... ]}` (default: `subreddits.json`)
+- `--subreddit NAME` Scan a single subreddit instead of the config
+- `--days N` Only scan posts from the last N days (default 1)
+- `--posts N` Max posts per subreddit to check (default 200)
+- `--comments N` Max comments per post to scan (default 100)
+- `--no-financials` Skip Yahoo Finance during the scan (faster)
+- `--update-top-tickers` Update financials for tickers that are currently top daily/weekly
+- `--update-financials-only [TICKER]` Update all or a single ticker’s market cap/close
+- `--stdout` Log to console as well as file; `--debug` for verbose

+Examples:

 ```bash
-pip install -e .
-```
-The installation is now complete.
+# Scan configured subs for last 24h, including financials
+rstat --days 1

----
+# Target a single subreddit for the past week, scan more comments
+rstat --subreddit wallstreetbets --days 7 --comments 250

-## Usage
+# Skip financials during scan, then update only top tickers
+rstat --no-financials
+rstat --update-top-tickers

-The tool is split into two commands: one for gathering data and one for viewing it.
+# Update financials for all tickers in DB
+rstat --update-financials-only

-### 1. The Scraper (`rstat`)
-This is the command-line tool you will use to populate the database. It is highly flexible.
+# Update a single ticker (case-insensitive)
+rstat --update-financials-only TSLA

-**Common Commands:**

-* **Run a daily scan (for cron jobs):** Scans subreddits from `subreddits.json` for posts in the last 24 hours.
-```bash
-rstat --config subreddits.json --days 1
-```

-* **Scan a single subreddit:** Ignores the config file and scans just one subreddit.
-```bash
-rstat --subreddit wallstreetbets --days 1
-```

-* **Back-fill data for last week:** Scans a specific subreddit for all new posts in the last 7 days.
-```bash
-rstat --subreddit Tollbugatabets --days 7
-```

-* **Get help and see all options:**
-```bash
-rstat --help
-```

-### 2. The Web Dashboard (`rstat-dashboard`)

-This command starts a local web server to let you explore the data you've collected.

-**How to Run:**
-1. Make sure you have run the `rstat` scraper at least once to populate the database.
-2. Start the web server:
-```bash
-rstat-dashboard
-```
-3. Open your web browser and navigate to **http://127.0.0.1:5000**.

-**Dashboard Features:**
-* **Main Page:** Shows the Top 10 most mentioned tickers across all scanned subreddits.
-* **Subreddit Pages:** Click any subreddit in the navigation bar to see a dashboard specific to that community.
-* **Deep Dive:** In any table, click on a ticker's symbol to see a detailed breakdown of every post it was mentioned in.
-* **Shareable Images:** On a subreddit's page, click "(View Daily Image)" or "(View Weekly Image)" to generate a polished, shareable summary card.

-### 3. Exporting Shareable Images (`.png`)

-In addition to viewing the dashboards in a browser, the project includes a powerful script to programmatically save the 'image views' as static `.png` files. This is ideal for automation, scheduled tasks (cron jobs), or sharing the results on social media platforms like your `r/rstat` subreddit.

-#### One-Time Setup

-The image exporter uses the Playwright library to control a headless browser. Before using it for the first time, you must install the necessary browser runtimes with this command:

-```bash
-playwright install
 ```

-#### Usage Workflow
+How mentions are detected:

-The exporter works by taking a high-quality screenshot of the live web page. Therefore, the process requires two steps running in two separate terminals.
+- If a post contains any $TICKER (e.g., `$TSLA`) anywhere, we use “golden-only” mode: only $-prefixed tickers are considered.
+- Otherwise, we fall back to filtered ALL-CAPS 2–5 letter words, excluding a large blacklist to avoid false positives.
+- Title tickers attribute all comments in the thread; otherwise, we scan comments directly for mentions.

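The two-tier mention detection described in the list above is easy to prototype. A minimal sketch, assuming simplified regexes and a toy blacklist — illustrative only, not the project's actual `ticker_extractor.py`:

```python
import re

# Illustrative only -- the real COMMON_WORDS_BLACKLIST in
# rstat_tool/ticker_extractor.py has hundreds of entries.
BLACKLIST = {"YOLO", "CEO", "USA", "ALL"}

GOLDEN = re.compile(r"\$([A-Za-z]{1,5})\b")   # $-prefixed "golden" tickers
ALLCAPS = re.compile(r"\b([A-Z]{2,5})\b")     # fallback: bare ALL-CAPS words

def extract_tickers(text: str) -> set[str]:
    """Golden-only mode if any $TICKER is present, else filtered ALL-CAPS."""
    golden = {m.upper() for m in GOLDEN.findall(text)}
    if golden:
        return golden
    return {w for w in ALLCAPS.findall(text) if w not in BLACKLIST}

print(extract_tickers("YOLO into $tsla and AAPL"))  # {'TSLA'} (golden-only)
print(extract_tickers("YOLO into TSLA and AAPL"))   # {'TSLA', 'AAPL'}
```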
-**Step 1: Start the Web Dashboard**
+## Web dashboard (rstat-dashboard)

-The web server must be running for the exporter to have a page to screenshot. Open a terminal and run:
+Start the dashboard and open http://127.0.0.1:5000

 ```bash
 rstat-dashboard
 ```
-Leave this terminal running.

-**Step 2: Run the Export Script**
+Features:

-Open a **second terminal** in the same project directory. You can now run the `export_image.py` script with the desired arguments.
+- Overall top 10 (daily/weekly) across all subs
+- Per-subreddit dashboards (daily/weekly)
+- Deep Dive pages listing posts analyzed for a ticker
+- Shareable image-friendly views (UI hides nav when `?image=true`)

-**Examples:**
+The dashboard reads from `reddit_stocks.db`. Run `rstat` first so you have data.

-* To export the **daily** summary image for `r/wallstreetbets`:
-```bash
-python export_image.py wallstreetbets
-```
+## Image export (export_image.py)

-* To export the **weekly** summary image for `r/wallstreetbets`:
-```bash
-python export_image.py wallstreetbets --weekly
-```
+Exports a high-res PNG of the dashboard views via Playwright. Note: the script currently uses `https://rstat.net` as its base URL.

-* To export the **overall** summary image (across all subreddits):
-```bash
-python export_image.py --overall
-```

-#### Output

-After running a command, a new `.png` file (e.g., `wallstreetbets_daily_1690000000.png`) will be saved in the images-directory in the root directory of the project.

-## 4. Full Automation: Posting to Reddit via Cron Job

-The final piece of the project is a script that automates the entire pipeline: scraping data, generating an image, and posting it to a target subreddit like `r/rstat`. This is designed to be run via a scheduled task or cron job.

-### Prerequisites: One-Time Account Authorization (OAuth2)

-To post on your behalf, the script needs to be authorized with your Reddit account. This is done securely using OAuth2 and a `refresh_token`, which is compatible with 2-Factor Authentication (2FA). This is a **one-time setup process**.

-**Step 1: Get Your Refresh Token**

-1. First, ensure the "redirect uri" in your [Reddit App settings](https://www.reddit.com/prefs/apps) is set to **exactly** `http://localhost:8080`.
-2. Run the temporary helper script included in the project:
-```bash
-python get_refresh_token.py
-```
-3. The script will print a unique URL. Copy this URL and paste it into your web browser.
-4. Log in to the Reddit account you want to post from and click **"Allow"** when prompted.
-5. You'll be redirected to a `localhost:8080` page that says "This site can’t be reached". **This is normal and expected.**
-6. Copy the **full URL** from your browser's address bar. It will look something like `http://localhost:8080/?state=...&code=...`.
-7. Paste this full URL back into the terminal where the script is waiting and press Enter.
-8. The script will output your unique **refresh token**.

-**Step 2: Update Your `.env` File**

-1. Open your `.env` file.
-2. Add a new line and paste your refresh token into it.
-3. Ensure your file now contains the following (your username and password are no longer needed):
-```
-REDDIT_CLIENT_ID=your_client_id_from_reddit
-REDDIT_CLIENT_SECRET=your_client_secret_from_reddit
-REDDIT_USER_AGENT=A custom user agent string (e.g., python:rstat:v1.2)
-REDDIT_REFRESH_TOKEN=the_long_refresh_token_string_you_just_copied
-```
-You can now safely delete the `get_refresh_token.py` script. Your application is now authorized to post on your behalf indefinitely.

-### The `post_to_reddit.py` Script

-This is the standalone script that finds the most recently generated image and posts it to Reddit using your new authorization.

-**Manual Usage:**

-* **Post the latest OVERALL summary image to `r/rstat`:**
-```bash
-python post_to_reddit.py
-```

-* **Post the latest DAILY image for a specific subreddit:**
-```bash
-python post_to_reddit.py --subreddit wallstreetbets
-```

-* **Post the latest WEEKLY image for a specific subreddit:**
-```bash
-python post_to_reddit.py --subreddit wallstreetbets --weekly
-```

-### Setting Up the Cron Job

-To run the entire pipeline automatically every day, you can use a simple shell script controlled by `cron`.

-**Step 1: Create a Job Script**

-Create a file named `run_daily_job.sh` in the root of your project directory.

-**`run_daily_job.sh`:**
 ```bash
-#!/bin/bash
+# Overall daily image

-# CRITICAL: Navigate to the project directory using an absolute path.
-# Replace '/path/to/your/project/reddit_stock_analyzer' with your actual path.
-cd /path/to/your/project/reddit_stock_analyzer

-# CRITICAL: Activate the virtual environment using an absolute path.
-source /path/to/your/project/reddit_stock_analyzer/.venv/bin/activate

-echo "--- Starting RSTAT Daily Job on $(date) ---"

-# 1. Scrape data from the last 24 hours.
-echo "Step 1: Scraping new data..."
-rstat --days 1

-# 2. Start the dashboard in the background.
-echo "Step 2: Starting dashboard in background..."
-rstat-dashboard &
-DASHBOARD_PID=$!
-sleep 10

-# 3. Export the overall summary image.
-echo "Step 3: Exporting overall summary image..."
 python export_image.py --overall

-# 4. Post the image to r/rstat.
-echo "Step 4: Posting image to Reddit..."
-python post_to_reddit.py --target-subreddit rstat
+# Subreddit daily image
+python export_image.py --subreddit wallstreetbets

-# 5. Clean up by stopping the dashboard server.
-echo "Step 5: Stopping dashboard server..."
-kill $DASHBOARD_PID
+# Weekly view
+python export_image.py --subreddit wallstreetbets --weekly

-echo "--- RSTAT Daily Job Complete ---"
 ```
-**Before proceeding, you must edit the two absolute paths at the top of this script to match your system.**

-**Step 2: Make the Script Executable**
+Output files are saved into the `images/` folder, e.g. `overall_summary_daily_1700000000.png`.

+Tip: If you want to export from a local dashboard instead of rstat.net, edit `base_url` in `export_image.py`.

+## Post images to Reddit (post_to_reddit.py)

+One-time OAuth2 step to obtain a refresh token:

+1) In your Reddit app settings, set the redirect URI to exactly `http://localhost:5000` (matches the script).
+2) Run:

+```bash
+python get_refresh_token.py
+```

+Follow the on-screen steps: open the generated URL, allow, copy the redirected URL, paste back. Add the printed token to `.env` as `REDDIT_REFRESH_TOKEN`.

+Now you can post:

+```bash
+# Post the most recent overall image to r/rstat
+python post_to_reddit.py

+# Post the most recent daily image for a subreddit
+python post_to_reddit.py --subreddit wallstreetbets

+# Post weekly image for a subreddit
+python post_to_reddit.py --subreddit wallstreetbets --weekly

+# Choose a target subreddit and (optionally) a flair ID
+python post_to_reddit.py --subreddit wallstreetbets --target-subreddit rstat --flair-id <ID>
+```

+Need a flair ID? Use the helper:

+```bash
+rstat-flairs wallstreetbets
+```

+## Cleanup utilities (rstat-cleanup)

+Remove blacklisted “ticker” rows and/or purge data for subreddits no longer in your config.

+```bash
+# Show help
+rstat-cleanup --help

+# Remove tickers that are in the internal COMMON_WORDS_BLACKLIST
+rstat-cleanup --tickers

+# Remove any subreddit data not in subreddits.json
+rstat-cleanup --subreddits

+# Use a custom config file
+rstat-cleanup --subreddits my_subs.json

+# Run both tasks
+rstat-cleanup --all
+```

+## Automation (cron)

+An example `run_daily_job.sh` is provided. Update `BASE_DIR` and make it executable:

 ```bash
 chmod +x run_daily_job.sh
 ```

-**Step 3: Schedule the Cron Job**
+Add a cron entry (example 22:00 daily):

-1. Run `crontab -e` to open your crontab editor.
-2. Add the following line to run the script every day at 10:00 PM and log its output:
+```
+0 22 * * * /absolute/path/to/reddit_stock_analyzer/run_daily_job.sh >> /absolute/path/to/reddit_stock_analyzer/cron.log 2>&1
+```

-```
-0 22 * * * /path/to/your/project/reddit_stock_analyzer/run_daily_job.sh >> /path/to/your/project/reddit_stock_analyzer/cron.log 2>&1
-```
+## Docker

-Your project is now fully and securely automated.
+Builds a Tailwind CSS layer, then a Python runtime with gunicorn. The compose files include optional nginx and varnish.

+Quick start for the dashboard only (uses your host `reddit_stocks.db`):

+```bash
+docker compose up -d rstat-dashboard
+```

+Notes:

+- The `rstat-dashboard` container mounts `./reddit_stocks.db` read-only. Populate it by running `rstat` on the host (or add a separate CLI container).
+- Prod compose includes nginx (and optional certbot/varnish) configs under `config/`.

+## Data model (SQLite)

+- `tickers(id, symbol UNIQUE, market_cap, closing_price, last_updated)`
+- `subreddits(id, name UNIQUE)`
+- `mentions(id, ticker_id, subreddit_id, post_id, comment_id NULLABLE, mention_type, mention_sentiment, mention_timestamp, UNIQUE(ticker_id, post_id, comment_id))`
+- `posts(id, post_id UNIQUE, title, post_url, subreddit_id, post_timestamp, comment_count, avg_comment_sentiment)`

+Uniqueness prevents duplicates across post/comment granularity. Cleanup helpers remove blacklisted “tickers” and stale subreddits.

+## UI and Tailwind

+The CSS (`static/css/style.css`) is generated from `static/css/input.css` using Tailwind 4 during Docker build. If you want to tweak UI locally:

+```bash
+npm install
+npx tailwindcss -i ./static/css/input.css -o ./static/css/style.css --minify
+```

+## Troubleshooting

+- Missing VADER: Run `python rstat_tool/setup_nltk.py` once (in your venv).
+- Playwright errors: Run `playwright install` once; ensure lib dependencies are present on your OS.
+- yfinance returns None: Retry later; some tickers or regions can be spotty. The app tolerates missing financials.
+- Flair required: If posting fails with flair errors, fetch a valid flair ID and pass `--flair-id`.
+- Empty dashboards: Make sure `rstat` ran recently and `.env` is set; check `rstat.log`.
+- DB locked: If you edit while the dashboard is reading, wait or stop the server; SQLite locks are short-lived.

+## Safety and notes

+- Do not commit `.env` or your database if it contains sensitive data.
+- This project is for research/entertainment. Not investment advice.

+---

+Made with Python, Flask, NLTK, Playwright, and Tailwind.
export_image.py
@@ -19,7 +19,7 @@ def export_image(url_path, filename_prefix):

     os.makedirs(OUTPUT_DIR, exist_ok=True)

-    base_url = "http://localhost"
+    base_url = "https://rstat.net"
     # Ensure the URL path starts correctly
     url_path = url_path.lstrip("/")
     url = f"{base_url}/{url_path}"
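Hard-coding the base URL means switching between local and production exports requires editing the file (the README section above carries the same tip). One hedged alternative — `RSTAT_BASE_URL` is a hypothetical variable, not something the repo defines:

```python
import os

# Hypothetical RSTAT_BASE_URL override; falls back to the URL
# hard-coded in this commit.
base_url = os.environ.get("RSTAT_BASE_URL", "https://rstat.net")
```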
post_to_reddit.py
@@ -8,7 +8,6 @@ import praw
 from dotenv import load_dotenv
 from pathlib import Path

-# --- CONFIGURATION ---
 IMAGE_DIR = "images"


@@ -51,39 +50,33 @@ def find_latest_image(pattern):
         print(f"Error finding image file: {e}")
         return None

-def get_flair_id(subreddit, flair_text):
-    """
-    Attempts to find the ID of a flair by its text.
-    Returns the ID string or None if not found or an error occurs.
-    """
-    if not flair_text:
-        return None
-
-    print(f"Attempting to find Flair ID for text: '{flair_text}'...")
-    try:
-        flairs = subreddit.flair.link_templates
-        for flair in flairs:
-            if flair['text'].lower() == flair_text.lower():
-                print(f" -> Found Flair ID: {flair['id']}")
-                return flair['id']
-        print(" -> Warning: No matching flair text found.")
-        return None
-    except Exception as e:
-        print(f" -> Warning: Could not fetch flairs for this subreddit (Error: {e}). Proceeding without flair.")
-        return None


 def main():
     """Main function to find an image and post it to Reddit."""
     parser = argparse.ArgumentParser(
         description="Find the latest sentiment image and post it to a subreddit."
     )
-    parser.add_argument("-s", "--subreddit", help="The source subreddit of the image to post. (Defaults to overall summary)")
-    parser.add_argument("-w", "--weekly", action="store_true", help="Post the weekly summary instead of the daily one.")
-    parser.add_argument("-t", "--target-subreddit", default="rstat", help="The subreddit to post the image to. (Default: rstat)")
-    parser.add_argument("--flair-text", help="The text of the flair to search for (e.g., 'Daily Summary').")
-    parser.add_argument("--flair-id", help="Manually provide a specific Flair ID (overrides --flair-text).")
+    parser.add_argument(
+        "-s",
+        "--subreddit",
+        help="The source subreddit of the image to post. (Defaults to overall summary)",
+    )
+    parser.add_argument(
+        "-w",
+        "--weekly",
+        action="store_true",
+        help="Post the weekly summary instead of the daily one.",
+    )
+    parser.add_argument(
+        "-t",
+        "--target-subreddit",
+        default="rstat",
+        help="The subreddit to post the image to. (Default: rstat)",
+    )
+    parser.add_argument(
+        "--flair-id",
+        help="The specific Flair ID to use for the post (required for some subreddits).",
+    )

     args = parser.parse_args()

@@ -125,23 +118,13 @@ def main():

     try:
         target_sub = reddit.subreddit(args.target_subreddit)

-        # --- NEW SMART FLAIR LOGIC ---
-        final_flair_id = None
-        if args.flair_id:
-            # If the user provides a specific ID, use it directly.
-            print(f"Using provided Flair ID: {args.flair_id}")
-            final_flair_id = args.flair_id
-        elif args.flair_text:
-            # If they provide text, try to find the ID automatically.
-            final_flair_id = get_flair_id(target_sub, args.flair_text)

         print(f"Submitting '{post_title}' to r/{target_sub.display_name}...")

+        # --- Simplified submission logic ---
         submission = target_sub.submit_image(
             title=post_title,
             image_path=image_to_post,
-            flair_id=final_flair_id # This will be the found ID or None
+            flair_id=args.flair_id, # Directly use the provided ID. This will be None if not provided.
         )

         print("\n--- Post Successful! ---")
@@ -149,8 +132,14 @@ def main():

     except Exception as e:
         print(f"\nAn error occurred while posting to Reddit: {e}")
-        if 'FLAIR_REQUIRED' in str(e):
-            print("\nHint: This subreddit requires a flair. Try finding the flair text or ID and use the --flair-text or --flair-id argument.")
+        if "FLAIR_REQUIRED" in str(e).upper():
+            print(
+                "\nHINT: This subreddit requires a flair. You MUST provide a valid Flair ID using the --flair-id argument."
+            )
+            print(
+                "Please follow the manual steps to find the Flair ID using your browser's developer tools."
+            )


 if __name__ == "__main__":
     main()
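A small detail in the error handler above: uppercasing the exception text before the substring test makes the FLAIR_REQUIRED check case-insensitive. A quick illustration with made-up error strings:

```python
# Made-up examples of error text; the exact wording surfaced by the API
# wrapper varies, which is why the handler normalizes case before matching.
for msg in ("FLAIR_REQUIRED: submission needs flair", "flair_required (field 'flair')"):
    print("FLAIR_REQUIRED" in msg.upper())  # True for both
```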
rstat_tool/dashboard.py
@@ -41,14 +41,20 @@ def overall_dashboard():
     view_type = request.args.get("view", "daily")
     is_image_mode = request.args.get("image") == "true"

+    try:
+        # Get the 'top' parameter, default to 10, and ensure it's an integer
+        top_n = int(request.args.get('top', 10))
+    except (ValueError, TypeError):
+        top_n = 10  # Fallback to 10 if the value is invalid

     if view_type == "weekly":
-        tickers, start, end = get_overall_weekly_summary()
+        tickers, start, end = get_overall_weekly_summary(limit=top_n)
         date_string = f"{start.strftime('%b %d')} - {end.strftime('%b %d, %Y')}"
-        subtitle = "All Subreddits - Top 10 Weekly"
+        subtitle = f"All Subreddits - Top {top_n} Weekly"
     else:  # Default to daily
-        tickers = get_overall_daily_summary()
+        tickers = get_overall_daily_summary(limit=top_n)
         date_string = datetime.now(timezone.utc).strftime("%Y-%m-%d")
-        subtitle = "All Subreddits - Top 10 Daily"
+        subtitle = f"All Subreddits - Top {top_n} Daily"

     return render_template(
         "dashboard_view.html",
@@ -69,16 +75,21 @@ def subreddit_dashboard(name):
     view_type = request.args.get("view", "daily")
     is_image_mode = request.args.get("image") == "true"

+    try:
+        top_n = int(request.args.get('top', 10))
+    except (ValueError, TypeError):
+        top_n = 10

     if view_type == "weekly":
         today = datetime.now(timezone.utc)
         target_date = today - timedelta(days=7)
-        tickers, start, end = get_weekly_summary_for_subreddit(name, target_date)
+        tickers, start, end = get_weekly_summary_for_subreddit(name, target_date, limit=top_n)
         date_string = f"{start.strftime('%b %d')} - {end.strftime('%b %d, %Y')}"
-        subtitle = f"r/{name} - Top 10 Weekly"
+        subtitle = f"r/{name} - Top {top_n} Weekly"
     else:  # Default to daily
-        tickers = get_daily_summary_for_subreddit(name)
+        tickers = get_daily_summary_for_subreddit(name, limit=top_n)
         date_string = datetime.now(timezone.utc).strftime("%Y-%m-%d")
-        subtitle = f"r/{name} - Top 10 Daily"
+        subtitle = f"r/{name} - Top {top_n} Daily"

     return render_template(
         "dashboard_view.html",
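Once the dashboard is running, the new `top` query parameter can be smoke-tested from Python. A minimal sketch — the URLs assume the default local address from the README and a populated database:

```python
# Smoke test for the new `top` parameter (start `rstat-dashboard` first).
import urllib.request

for url in (
    "http://127.0.0.1:5000/?top=25",             # overall daily, top 25
    "http://127.0.0.1:5000/?view=weekly&top=5",  # overall weekly, top 5
    "http://127.0.0.1:5000/?top=abc",            # invalid value falls back to 10
):
    with urllib.request.urlopen(url) as resp:
        print(url, "->", resp.status)
```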
rstat_tool/database.py
@@ -2,7 +2,11 @@

 import sqlite3
 import time
-from .ticker_extractor import COMMON_WORDS_BLACKLIST, extract_golden_tickers, extract_potential_tickers
+from .ticker_extractor import (
+    COMMON_WORDS_BLACKLIST,
+    extract_golden_tickers,
+    extract_potential_tickers,
+)
 from .logger_setup import logger as log
 from datetime import datetime, timedelta, timezone

@@ -111,12 +115,14 @@ def initialize_db():
             ticker_id INTEGER,
             subreddit_id INTEGER,
             post_id TEXT NOT NULL,
+            comment_id TEXT, -- NEW: Will be NULL for post mentions
             mention_type TEXT NOT NULL,
             mention_sentiment REAL,
-            post_avg_sentiment REAL,
             mention_timestamp INTEGER NOT NULL,
             FOREIGN KEY (ticker_id) REFERENCES tickers (id),
-            FOREIGN KEY (subreddit_id) REFERENCES subreddits (id)
+            FOREIGN KEY (subreddit_id) REFERENCES subreddits (id),
+            -- The new, perfect uniqueness rule:
+            UNIQUE(ticker_id, post_id, comment_id)
         )
     """
     )
@@ -148,27 +154,27 @@ def add_mention(
     mention_type,
     timestamp,
     mention_sentiment,
-    post_avg_sentiment=None,
+    comment_id=None,
 ):
     cursor = conn.cursor()
     try:
         cursor.execute(
             """
-            INSERT INTO mentions (ticker_id, subreddit_id, post_id, mention_type, mention_timestamp, mention_sentiment, post_avg_sentiment)
+            INSERT INTO mentions (ticker_id, subreddit_id, post_id, comment_id, mention_type, mention_timestamp, mention_sentiment)
             VALUES (?, ?, ?, ?, ?, ?, ?)
             """,
             (
                 ticker_id,
                 subreddit_id,
                 post_id,
+                comment_id,
                 mention_type,
                 timestamp,
                 mention_sentiment,
-                post_avg_sentiment,
             ),
         )
-        conn.commit()
     except sqlite3.IntegrityError:
+        # This will now correctly catch and ignore any true duplicates.
         pass


@@ -231,7 +237,8 @@ def get_week_start_end(for_date):
     end_of_week = end_of_week.replace(hour=23, minute=59, second=59, microsecond=999999)
     return start_of_week, end_of_week

-def get_overall_daily_summary():
+def get_overall_daily_summary(limit=10):
     """Gets the top tickers across all subreddits from the LAST 24 HOURS."""
     conn = get_db_connection()
     one_day_ago = datetime.now(timezone.utc) - timedelta(days=1)
@@ -243,13 +250,14 @@ def get_overall_daily_summary():
         FROM mentions m JOIN tickers t ON m.ticker_id = t.id
         WHERE m.mention_timestamp >= ?
         GROUP BY t.symbol, t.market_cap, t.closing_price
-        ORDER BY total_mentions DESC LIMIT 10;
+        ORDER BY total_mentions DESC LIMIT ?;
     """
-    results = conn.execute(query, (one_day_ago_timestamp,)).fetchall()
+    results = conn.execute(query, (one_day_ago_timestamp, limit)).fetchall()
     conn.close()
     return results

-def get_overall_weekly_summary():
+def get_overall_weekly_summary(limit=10):
     """Gets the top tickers across all subreddits for LAST WEEK (Mon-Sun)."""
     conn = get_db_connection()
     today = datetime.now(timezone.utc)
@@ -264,13 +272,14 @@ def get_overall_weekly_summary():
         FROM mentions m JOIN tickers t ON m.ticker_id = t.id
         WHERE m.mention_timestamp BETWEEN ? AND ?
         GROUP BY t.symbol, t.market_cap, t.closing_price
-        ORDER BY total_mentions DESC LIMIT 10;
+        ORDER BY total_mentions DESC LIMIT ?;
     """
-    results = conn.execute(query, (start_timestamp, end_timestamp)).fetchall()
+    results = conn.execute(query, (start_timestamp, end_timestamp, limit)).fetchall()
     conn.close()
     return results, start_of_week, end_of_week

-def get_daily_summary_for_subreddit(subreddit_name):
+def get_daily_summary_for_subreddit(subreddit_name, limit=10):
     """Gets a summary for a subreddit's DAILY view (last 24 hours)."""
     conn = get_db_connection()
     one_day_ago = datetime.now(timezone.utc) - timedelta(days=1)
@@ -282,13 +291,14 @@ def get_daily_summary_for_subreddit(subreddit_name):
         FROM mentions m JOIN tickers t ON m.ticker_id = t.id JOIN subreddits s ON m.subreddit_id = s.id
         WHERE LOWER(s.name) = LOWER(?) AND m.mention_timestamp >= ?
         GROUP BY t.symbol, t.market_cap, t.closing_price
-        ORDER BY total_mentions DESC LIMIT 10;
+        ORDER BY total_mentions DESC LIMIT ?;
     """
-    results = conn.execute(query, (subreddit_name, one_day_ago_timestamp)).fetchall()
+    results = conn.execute(query, (subreddit_name, one_day_ago_timestamp, limit)).fetchall()
     conn.close()
     return results

-def get_weekly_summary_for_subreddit(subreddit_name, for_date):
+def get_weekly_summary_for_subreddit(subreddit_name, for_date, limit=10):
     """Gets a summary for a subreddit's WEEKLY view (for a specific week)."""
     conn = get_db_connection()
     start_of_week, end_of_week = get_week_start_end(for_date)
@@ -301,9 +311,11 @@ def get_weekly_summary_for_subreddit(subreddit_name, for_date):
         FROM mentions m JOIN tickers t ON m.ticker_id = t.id JOIN subreddits s ON m.subreddit_id = s.id
         WHERE LOWER(s.name) = LOWER(?) AND m.mention_timestamp BETWEEN ? AND ?
         GROUP BY t.symbol, t.market_cap, t.closing_price
-        ORDER BY total_mentions DESC LIMIT 10;
+        ORDER BY total_mentions DESC LIMIT ?;
     """
-    results = conn.execute(query, (subreddit_name, start_timestamp, end_timestamp)).fetchall()
+    results = conn.execute(
+        query, (subreddit_name, start_timestamp, end_timestamp, limit)
+    ).fetchall()
     conn.close()
     return results, start_of_week, end_of_week

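Two patterns in these database changes generalize: SQLite binds `LIMIT ?` like any other parameter, and the new `UNIQUE(ticker_id, post_id, comment_id)` constraint turns exact duplicates into an `IntegrityError` that `add_mention` swallows. One caveat worth knowing: SQLite treats NULLs as distinct in unique indexes, so repeated post-level mentions (NULL `comment_id`) are not rejected by the constraint. A self-contained sketch with a toy schema mirroring the diff:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE mentions (
           ticker_id INTEGER,
           post_id TEXT NOT NULL,
           comment_id TEXT,
           UNIQUE(ticker_id, post_id, comment_id)
       )"""
)

def add_mention(ticker_id, post_id, comment_id=None):
    try:
        conn.execute(
            "INSERT INTO mentions VALUES (?, ?, ?)",
            (ticker_id, post_id, comment_id),
        )
    except sqlite3.IntegrityError:
        pass  # duplicate (ticker, post, comment) -> ignore

add_mention(1, "p1", "c1")
add_mention(1, "p1", "c1")   # exact duplicate: IntegrityError, ignored
add_mention(1, "p1", "c2")   # different comment: kept
add_mention(1, "p1")         # post-level row (NULL comment_id)
add_mention(1, "p1")         # NULLs are distinct in UNIQUE indexes,
                             # so this second NULL row is NOT rejected

print(conn.execute("SELECT COUNT(*) FROM mentions").fetchone()[0])  # 4

# LIMIT binds like any other parameter:
print(conn.execute("SELECT * FROM mentions LIMIT ?", (2,)).fetchall())
```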
rstat_tool/flair_finder.py (new file, 76 lines)
@@ -0,0 +1,76 @@
+# rstat_tool/flair_finder.py
+# A dedicated tool to find available link flairs for a subreddit.
+
+import argparse
+import sys
+import os
+import praw
+from dotenv import load_dotenv
+from pathlib import Path
+
+def get_reddit_instance_for_flairs():
+    """
+    Initializes and returns an authenticated PRAW instance using the refresh token.
+    This is a copy of the robust authentication from the posting script.
+    """
+    # Find the .env file relative to the project root
+    env_path = Path(__file__).parent.parent / '.env'
+    load_dotenv(dotenv_path=env_path)
+
+    client_id = os.getenv("REDDIT_CLIENT_ID")
+    client_secret = os.getenv("REDDIT_CLIENT_SECRET")
+    user_agent = os.getenv("REDDIT_USER_AGENT")
+    refresh_token = os.getenv("REDDIT_REFRESH_TOKEN")
+
+    if not all([client_id, client_secret, user_agent, refresh_token]):
+        print("Error: Reddit API credentials (including REDDIT_REFRESH_TOKEN) must be set in .env file.", file=sys.stderr)
+        return None
+
+    return praw.Reddit(
+        client_id=client_id,
+        client_secret=client_secret,
+        user_agent=user_agent,
+        refresh_token=refresh_token
+    )
+
+def main():
+    """Main function to fetch and display flairs."""
+    parser = argparse.ArgumentParser(description="Fetch and display available post flairs for a subreddit.")
+    parser.add_argument("subreddit", help="The name of the subreddit to check.")
+    args = parser.parse_args()
+
+    reddit = get_reddit_instance_for_flairs()
+    if not reddit:
+        sys.exit(1)
+
+    print(f"\n--- Attempting to Fetch Post Flairs for r/{args.subreddit} ---")
+
+    try:
+        # This uses PRAW's generic GET request method to hit the specific API endpoint.
+        api_path = f"/r/{args.subreddit}/api/link_flair_v2.json"
+        flairs = reddit.get(api_path, params={"raw_json": 1})
+
+        if not flairs:
+            print("No flairs found or flair list is empty for this subreddit.")
+            return
+
+        print("\n--- Available Post Flairs ---")
+        found_count = 0
+        for flair in flairs:
+            flair_text = flair.get('text')
+            flair_id = flair.get('id')
+            if flair_text and flair_id:
+                print(f"  Flair Text: '{flair_text}'")
+                print(f"  Flair ID: {flair_id}\n")
+                found_count += 1
+
+        if found_count == 0:
+            print("No flairs with both text and ID were found.")
+
+    except Exception as e:
+        print(f"\nAn error occurred: {e}", file=sys.stderr)
+        print("Hint: Please ensure the subreddit exists and that your authenticated user has permission to view it.", file=sys.stderr)
+        sys.exit(1)
+
+if __name__ == "__main__":
+    main()
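Because the new module guards its entry point with `if __name__ == "__main__"`, it can also be run without the `rstat-flairs` console script — e.g. `python -m rstat_tool.flair_finder wallstreetbets` from the repo root — assuming the package is importable and `.env` contains the refresh token.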
rstat_tool/ticker_extractor.py
@@ -3,116 +3,118 @@
 COMMON_WORDS_BLACKLIST = {
     "401K", "403B", "457B", "AAVE", "ABC", "ABOUT", "ABOVE", "ACAT", "ADAM", "ADHD",
     "ADR", "ADS", "ADX", "AEDT", "AEST", "AF", "AFAIK", "AFTER", "AGENT", "AH",
-    "AI", "AINT", "AK", "ALD", "ALGOS", "ALIVE", "ALL", "ALPHA", "ALSO", "AM",
-    "AMA", "AMEX", "AMK", "AMY", "AND", "ANSS", "ANY", "APES", "APL", "APPL",
-    "APPLE", "APR", "APUS", "APY", "AR", "ARBK", "ARE", "AREA", "ARH", "ARK",
-    "AROUND", "ART", "AS", "ASAP", "ASEAN", "ASK", "ASS", "ASSET", "AST", "AT",
-    "ATH", "ATL", "ATM", "AUD", "AUG", "AUM", "AV", "AVG", "AWS", "BABY",
-    "BAG", "BAGS", "BALLS", "BAN", "BANG", "BASIC", "BBB", "BBBY", "BE", "BEAR",
-    "BEARS", "BECN", "BEER", "BELL", "BELOW", "BETA", "BETS", "BF", "BID", "BIG",
-    "BIS", "BITCH", "BKEY", "BLEND", "BMW", "BNP", "BNPL", "BOE", "BOJ", "BOLL",
-    "BOMB", "BOND", "BONED", "BORN", "BOTH", "BOTS", "BOY", "BOYS", "BRB", "BRICS",
-    "BRK", "BRKA", "BRKB", "BRL", "BROKE", "BRRRR", "BS", "BSE", "BST", "BSU",
-    "BT", "BTC", "BTS", "BTW", "BUDDY", "BULL", "BULLS", "BUST", "BUT", "BUY",
-    "BUZZ", "CAD", "CAFE", "CAGR", "CALL", "CALLS", "CAN", "CAP", "CARB", "CARES",
-    "CASE", "CATL", "CBD", "CBGM", "CBS", "CCI", "CCP", "CD", "CDN", "CEO",
-    "CEST", "CET", "CEX", "CFD", "CFO", "CFPB", "CHART", "CHASE", "CHATS", "CHECK",
-    "CHF", "CHICK", "CHIP", "CHIPS", "CIA", "CIC", "CLAIM", "CLEAN", "CLICK", "CLOSE",
-    "CMON", "CN", "CNBC", "CNN", "CNY", "COBRA", "COCK", "COGS", "COIL", "COKE",
-    "COME", "COST", "COULD", "COVID", "CPAP", "CPI", "CRA", "CRE", "CRO", "CRV",
-    "CSE", "CSP", "CSS", "CST", "CTB", "CTEP", "CTO", "CUCKS", "CULT", "CUM",
-    "CUSMA", "CUTS", "CUV", "CYCLE", "CZK", "DA", "DAILY", "DAO", "DATE", "DAX",
-    "DAY", "DAYS", "DCA", "DCF", "DD", "DEAL", "DEBT", "DEEZ", "DEMO", "DET",
-    "DEX", "DGAF", "DIA", "DID", "DIDNT", "DIP", "DITM", "DIV", "DIY", "DJI",
-    "DJIA", "DJTJ", "DKK", "DL", "DM", "DMV", "DNI", "DNUTZ", "DO", "DOD",
-    "DOE", "DOES", "DOGE", "DOING", "DOJ", "DOM", "DONNY", "DONT", "DONUT", "DOOR",
-    "DOWN", "DOZEN", "DPI", "DR", "DUDE", "DUMP", "DUNT", "DUT", "DUTY", "DXY",
-    "DXYXBT", "DYI", "DYNK", "DYODD", "DYOR", "EACH", "EARLY", "EARN", "EAST", "EASY",
-    "ECB", "EDGAR", "EDIT", "EDT", "EJ", "EMA", "EMJ", "EMT", "END", "ENRON",
-    "ENSI", "ENV", "EO", "EOD", "EOM", "EOW", "EOY", "EPA", "EPK", "EPS",
-    "ER", "ESG", "ESPP", "EST", "ETA", "ETF", "ETFS", "ETH", "ETL", "EU",
-    "EUR", "EV", "EVEN", "EVERY", "EVTOL", "EXTRA", "EYES", "EZ", "FAANG", "FAFO",
-    "FAQ", "FAR", "FAST", "FBI", "FCC", "FCFF", "FD", "FDA", "FEE", "FFH",
-    "FFS", "FGMA", "FIG", "FIGMA", "FIHTX", "FILES", "FINAL", "FIND", "FING", "FINRA",
-    "FINT", "FINTX", "FINTY", "FIRE", "FIRST", "FKIN", "FLRAA", "FLT", "FLY", "FML",
+    "AI", "AINT", "AK", "AKSJE", "ALD", "ALGOS", "ALIVE", "ALL", "ALPHA", "ALSO",
+    "AM", "AMA", "AMEX", "AMK", "AMY", "AND", "ANSS", "ANY", "APES", "APL",
+    "APPL", "APPLE", "APR", "APUS", "APY", "AR", "ARBK", "ARE", "AREA", "ARH",
+    "ARK", "AROUND", "ART", "AS", "ASAP", "ASEAN", "ASK", "ASS", "ASSET", "AST",
+    "AT", "ATH", "ATL", "ATM", "AUD", "AUG", "AUM", "AV", "AVG", "AWS",
+    "BABY", "BAD", "BAG", "BAGS", "BALLS", "BAN", "BANG", "BASIC", "BBB", "BBBY",
+    "BE", "BEAR", "BEARS", "BECN", "BEER", "BELL", "BELOW", "BETA", "BETS", "BF",
+    "BID", "BIG", "BIS", "BITCH", "BKEY", "BLEND", "BLOW", "BMW", "BNP", "BNPL",
+    "BOARD", "BOE", "BOJ", "BOLL", "BOMB", "BOND", "BONED", "BORN", "BOTH", "BOTS",
+    "BOY", "BOYS", "BRB", "BRICS", "BRK", "BRKA", "BRKB", "BRL", "BROKE", "BRRRR",
+    "BS", "BSE", "BST", "BSU", "BT", "BTC", "BTS", "BTW", "BUDDY", "BULL",
+    "BULLS", "BUST", "BUT", "BUY", "BUZZ", "CAD", "CAFE", "CAGR", "CALL", "CALLS",
+    "CAN", "CAP", "CARB", "CARES", "CASE", "CATL", "CBD", "CBGM", "CBS", "CCI",
+    "CCP", "CD", "CDN", "CEO", "CEST", "CET", "CEX", "CFD", "CFO", "CFPB",
+    "CHART", "CHASE", "CHATS", "CHECK", "CHF", "CHICK", "CHIP", "CHIPS", "CIA", "CIC",
+    "CLAIM", "CLEAN", "CLICK", "CLOSE", "CMON", "CN", "CNBC", "CNN", "CNY", "COBRA",
+    "COCK", "COGS", "COIL", "COKE", "COME", "COST", "COULD", "COVID", "CPAP", "CPI",
+    "CRA", "CRE", "CRO", "CRV", "CSE", "CSP", "CSS", "CST", "CTB", "CTEP",
+    "CTO", "CUCKS", "CULT", "CUM", "CUSMA", "CUTS", "CUV", "CYCLE", "CZK", "DA",
+    "DAILY", "DAO", "DART", "DATA", "DATE", "DAX", "DAY", "DAYS", "DCA", "DCF",
+    "DD", "DEAL", "DEBT", "DEEZ", "DEMO", "DET", "DEX", "DGAF", "DIA", "DID",
+    "DIDNT", "DIP", "DITM", "DIV", "DIY", "DJI", "DJIA", "DJTJ", "DKK", "DL",
+    "DM", "DMV", "DNI", "DNUTZ", "DO", "DOD", "DOE", "DOES", "DOGE", "DOING",
+    "DOJ", "DOM", "DONNY", "DONT", "DONUT", "DOOR", "DOWN", "DOZEN", "DPI", "DR",
+    "DUDE", "DUMP", "DUNT", "DUT", "DUTY", "DXY", "DXYXBT", "DYI", "DYNK", "DYODD",
+    "DYOR", "EACH", "EARLY", "EARN", "EAST", "EASY", "EBIT", "ECB", "EDGAR", "EDIT",
+    "EDT", "EJ", "EMA", "EMJ", "EMT", "END", "ENRON", "ENSI", "ENV", "EO",
+    "EOD", "EOM", "EOW", "EOY", "EPA", "EPK", "EPS", "ER", "ESG", "ESPP",
+    "EST", "ETA", "ETF", "ETFS", "ETH", "ETHT", "ETL", "EU", "EUR", "EV",
+    "EVEN", "EVERY", "EVTOL", "EXTRA", "EYES", "EZ", "FAANG", "FAFO", "FAQ", "FAR",
+    "FAST", "FBI", "FCC", "FCFF", "FD", "FDA", "FED", "FEE", "FFH", "FFS",
+    "FGMA", "FIG", "FIGMA", "FIHTX", "FILES", "FINAL", "FIND", "FING", "FINRA", "FINT",
+    "FINTX", "FINTY", "FIRE", "FIRST", "FKIN", "FLOAT", "FLRAA", "FLT", "FLY", "FML",
     "FOLO", "FOMC", "FOMO", "FOR", "FOREX", "FRAUD", "FREAK", "FRED", "FRG", "FROM",
     "FRP", "FRS", "FSBO", "FSD", "FSE", "FSELK", "FSPSX", "FTD", "FTSE", "FUCK",
     "FUCKS", "FUD", "FULL", "FUND", "FUNNY", "FVG", "FWIW", "FX", "FXAIX", "FXIAX",
-    "FXROX", "FY", "FYI", "FZROX", "GAAP", "GAIN", "GAVE", "GBP", "GC", "GDP",
-    "GET", "GFC", "GG", "GGTM", "GIVES", "GJ", "GL", "GLHF", "GMAT", "GMI",
-    "GMT", "GO", "GOAL", "GOAT", "GOD", "GOING", "GOLD", "GONE", "GONNA", "GOODS",
-    "GOPRO", "GPT", "GPU", "GRAB", "GREAT", "GREEN", "GSOV", "GST", "GTA", "GTC",
-    "GTFO", "GTG", "GUH", "GUNS", "GUY", "GUYS", "HAD", "HAHA", "HALF", "HAM",
-    "HANDS", "HAS", "HATE", "HAVE", "HBAR", "HCOL", "HEAR", "HEDGE", "HEGE", "HELD",
-    "HELL", "HELP", "HERE", "HEY", "HFCS", "HFT", "HGTV", "HIGH", "HIGHS", "HINT",
-    "HIS", "HITID", "HK", "HKD", "HKEX", "HODL", "HODOR", "HOF", "HOLD", "HOLY",
-    "HOME", "HOT", "HOUR", "HOURS", "HOW", "HS", "HSA", "HSI", "HT", "HTCI",
-    "HTF", "HTML", "HUF", "HUGE", "HV", "HYPE", "IANAL", "IATF", "IB", "IBS",
-    "ICSID", "ICT", "ID", "IDF", "IDK", "IF", "II", "IIRC", "IKKE", "IKZ",
-    "IM", "IMHO", "IMI", "IMO", "IN", "INC", "INR", "INTEL", "INTO", "IP",
-    "IPO", "IQVIA", "IRA", "IRAS", "IRC", "IRISH", "IRMAA", "IRS", "IS", "ISA",
-    "ISIN", "ISM", "ISN", "IST", "IT", "ITC", "ITM", "ITS", "ITWN", "IUIT",
-    "IV", "IVV", "IWM", "IXL", "IXLH", "IYKYK", "JAVA", "JD", "JDG", "JDM",
-    "JE", "JFC", "JK", "JLR", "JMO", "JOBS", "JOIN", "JOKE", "JP", "JPOW",
-    "JPY", "JS", "JST", "JUN", "JUST", "KARMA", "KEEP", "KILL", "KING", "KK",
-    "KLA", "KLP", "KNEW", "KNOW", "KO", "KOHLS", "KPMG", "KRW", "LA", "LANGT",
-    "LARGE", "LAST", "LATE", "LATER", "LBO", "LBTC", "LCS", "LDL", "LEADS", "LEAP",
-    "LEAPS", "LEARN", "LEI", "LET", "LETF", "LETS", "LFA", "LFG", "LFP", "LG",
-    "LGEN", "LIFE", "LIG", "LIGMA", "LIKE", "LIMIT", "LIST", "LLC", "LLM", "LM",
-    "LMAO", "LMAOO", "LMM", "LMN", "LOANS", "LOKO", "LOL", "LOLOL", "LONG", "LONGS",
-    "LOOK", "LOSE", "LOSS", "LOST", "LOVE", "LOVES", "LOW", "LOWER", "LOWS", "LP",
-    "LSS", "LTCG", "LUCID", "LUPD", "LYC", "LYING", "M&A", "MA", "MACD", "MAIL",
-    "MAKE", "MAKES", "MANGE", "MANY", "MASON", "MAX", "MAY", "MAYBE", "MBA", "MC",
-    "MCAP", "MCNA", "MCP", "ME", "MEAN", "MEME", "MERGE", "MERK", "MES", "MEXC",
-    "MF", "MFER", "MID", "MIGHT", "MIN", "MIND", "MINS", "ML", "MLB", "MLS",
-    "MM", "MMF", "MNQ", "MOASS", "MODEL", "MODTX", "MOM", "MONEY", "MONTH", "MONY",
-    "MOON", "MORE", "MOST", "MOU", "MSK", "MTVGA", "MUCH", "MUSIC", "MUST", "MVA",
-    "MXN", "MY", "MYMD", "NASA", "NASDA", "NATO", "NAV", "NBA", "NBC", "NCAN",
-    "NCR", "NEAR", "NEAT", "NEED", "NEVER", "NEW", "NEWS", "NEXT", "NFA", "NFC",
-    "NFL", "NFT", "NGAD", "NGMI", "NIGHT", "NIQ", "NK", "NO", "NOK", "NON",
-    "NONE", "NOOO", "NOPE", "NORTH", "NOT", "NOVA", "NOW", "NQ", "NRI", "NSA",
-    "NSCLC", "NSLC", "NTG", "NTVS", "NULL", "NUT", "NUTS", "NUTZ", "NVM", "NW",
+    "FXROX", "FY", "FYI", "FZROX", "GAAP", "GAIN", "GAV", "GAVE", "GBP", "GC",
+    "GDP", "GET", "GFC", "GG", "GGTM", "GIVES", "GJ", "GL", "GLHF", "GMAT",
+    "GMI", "GMT", "GO", "GOAL", "GOAT", "GOD", "GOING", "GOLD", "GONE", "GONNA",
+    "GOODS", "GOPRO", "GPT", "GPU", "GRAB", "GREAT", "GREEN", "GSOV", "GST", "GTA",
+    "GTC", "GTFO", "GTG", "GUH", "GUNS", "GUY", "GUYS", "HAD", "HAHA", "HALF",
+    "HAM", "HANDS", "HAS", "HATE", "HAVE", "HBAR", "HCOL", "HEAR", "HEDGE", "HEGE",
+    "HELD", "HELE", "HELL", "HELP", "HERE", "HEY", "HFCS", "HFT", "HGTV", "HIGH",
+    "HIGHS", "HINT", "HIS", "HITID", "HK", "HKD", "HKEX", "HODL", "HODOR", "HOF",
+    "HOLD", "HOLY", "HOME", "HOT", "HOUR", "HOURS", "HOW", "HS", "HSA", "HSI",
+    "HT", "HTCI", "HTF", "HTML", "HUF", "HUGE", "HV", "HYPE", "IANAL", "IATF",
+    "IB", "IBS", "ICSID", "ICT", "ID", "IDF", "IDK", "IF", "II", "IIRC",
+    "IKKE", "IKZ", "IM", "IMHO", "IMI", "IMO", "IN", "INC", "INR", "INTEL",
+    "INTO", "IP", "IPO", "IQVIA", "IRA", "IRAS", "IRC", "IRISH", "IRL", "IRMAA",
+    "IRS", "IS", "ISA", "ISIN", "ISM", "ISN", "IST", "IT", "ITC", "ITM",
+    "ITS", "ITWN", "IUIT", "IV", "IVV", "IWM", "IXL", "IXLH", "IYKYK", "JAVA",
+    "JD", "JDG", "JDM", "JE", "JFC", "JK", "JLR", "JMO", "JOBS", "JOIN",
+    "JOKE", "JP", "JPOW", "JPY", "JS", "JST", "JULY", "JUN", "JUST", "KARMA",
+    "KEEP", "KILL", "KING", "KK", "KLA", "KLP", "KNEW", "KNOW", "KO", "KOHLS",
+    "KPMG", "KRW", "LA", "LANGT", "LARGE", "LAST", "LATE", "LATER", "LBO", "LBTC",
+    "LCS", "LDL", "LEADS", "LEAP", "LEAPS", "LEARN", "LEGS", "LEI", "LET", "LETF",
+    "LETS", "LFA", "LFG", "LFP", "LG", "LGEN", "LID", "LIFE", "LIG", "LIGMA",
+    "LIKE", "LIMIT", "LIST", "LLC", "LLM", "LM", "LMAO", "LMAOO", "LMM", "LMN",
+    "LOANS", "LOKO", "LOL", "LOLOL", "LONG", "LONGS", "LOOK", "LOSE", "LOSS", "LOST",
+    "LOVE", "LOVES", "LOW", "LOWER", "LOWS", "LP", "LSS", "LTCG", "LUCID", "LUPD",
+    "LYC", "LYING", "M&A", "MA", "MACD", "MAIL", "MAKE", "MAKES", "MANGE", "MANY",
+    "MASON", "MAX", "MAY", "MAYBE", "MBA", "MC", "MCAP", "MCNA", "MCP", "ME",
+    "MEAN", "MEME", "MER", "MERGE", "MERK", "MES", "MEXC", "MF", "MFER", "MID",
+    "MIGHT", "MIN", "MIND", "MINS", "ML", "MLB", "MLS", "MM", "MMF", "MNQ",
+    "MOASS", "MODEL", "MODTX", "MOM", "MONEY", "MONGO", "MONTH", "MONY", "MOON", "MORE",
+    "MOST", "MOU", "MSK", "MTVGA", "MUCH", "MUSIC", "MUST", "MVA", "MXN", "MY",
+    "MYMD", "NASA", "NASDA", "NATO", "NAV", "NBA", "NBC", "NCAN", "NCR", "NEAR",
+    "NEAT", "NEED", "NEVER", "NEW", "NEWS", "NEXT", "NFA", "NFC", "NFL", "NFT",
+    "NGAD", "NGMI", "NIGHT", "NIQ", "NK", "NO", "NOK", "NON", "NONE", "NOOO",
+    "NOPE", "NORTH", "NOT", "NOVA", "NOW", "NQ", "NRI", "NSA", "NSCLC", "NSLC",
|
||||||
"NY", "NYSE", "NZ", "NZD", "OBBB", "OBI", "OBS", "OBV", "OCD", "OCF",
|
"NTG", "NTVS", "NULL", "NUT", "NUTS", "NUTZ", "NVIDIA", "NVM", "NW", "NY",
|
||||||
"OCO", "ODAT", "ODTE", "OEM", "OF", "OFA", "OFF", "OG", "OH", "OK",
|
"NYSE", "NZ", "NZD", "OBBB", "OBI", "OBS", "OBV", "OCD", "OCF", "OCO",
|
||||||
"OKAY", "OL", "OLD", "OMFG", "OMG", "ON", "ONDAS", "ONE", "ONLY", "OP",
|
"ODAT", "ODTE", "OEM", "OF", "OFA", "OFF", "OG", "OH", "OK", "OKAY",
|
||||||
"OPEC", "OPENQ", "OPEX", "OPRN", "OR", "ORB", "ORDER", "ORTEX", "OS", "OSCE",
|
"OL", "OLD", "OMFG", "OMG", "ON", "ONDAS", "ONE", "ONLY", "OP", "OPEC",
|
||||||
"OT", "OTC", "OTM", "OTOH", "OUCH", "OUGHT", "OUR", "OUT", "OVER", "OWN",
|
"OPENQ", "OPEX", "OPRN", "OR", "ORB", "ORDER", "ORTEX", "OS", "OSCE", "OSE",
|
||||||
"OZZY", "PA", "PANIC", "PC", "PDT", "PE", "PEAK", "PEG", "PETA", "PEW",
|
"OSEBX", "OT", "OTC", "OTM", "OTOH", "OUCH", "OUGHT", "OUR", "OUT", "OVER",
|
||||||
"PFC", "PGHL", "PIMCO", "PITA", "PLAN", "PLAYS", "PLC", "PLN", "PM", "PMCC",
|
"OWN", "OZZY", "PA", "PAID", "PANIC", "PC", "PDT", "PE", "PEAK", "PEG",
|
||||||
"PMI", "PNL", "POC", "POMO", "POP", "POS", "POSCO", "POTUS", "POV", "POW",
|
"PETA", "PEW", "PFC", "PGHL", "PIMCO", "PITA", "PLAN", "PLAYS", "PLC", "PLN",
|
||||||
"PPI", "PR", "PRICE", "PRIME", "PROFIT", "PROXY", "PS", "PSA", "PST", "PT",
|
"PM", "PMCC", "PMI", "PNL", "POC", "POMO", "POP", "POS", "POSCO", "POTUS",
|
||||||
"PTD", "PUSSY", "PUT", "PUTS", "PWC", "Q1", "Q2", "Q3", "Q4", "QE",
|
"POV", "POW", "PPI", "PR", "PRICE", "PRIME", "PROFIT", "PROXY", "PS", "PSA",
|
||||||
"QED", "QIMC", "QQQ", "QR", "RAM", "RATM", "RBA", "RBNZ", "RE", "REACH",
|
"PST", "PT", "PTD", "PUSSY", "PUT", "PUTS", "PWC", "Q1", "Q2", "Q3",
|
||||||
"READY", "REAL", "RED", "REIT", "REITS", "REKT", "REPE", "RFK", "RH", "RICO",
|
"Q4", "QE", "QED", "QIMC", "QQQ", "QR", "RAM", "RATM", "RBA", "RBNZ",
|
||||||
"RIDE", "RIGHT", "RIP", "RISK", "RISKY", "RNDC", "ROCE", "ROCK", "ROE", "ROFL",
|
"RE", "REACH", "READY", "REAL", "RED", "REIT", "REITS", "REKT", "REPE", "RFK",
|
||||||
"ROI", "ROIC", "ROTH", "RPO", "RRSP", "RSD", "RSI", "RT", "RTD", "RUB",
|
"RH", "RICO", "RIDE", "RIGHT", "RIP", "RISK", "RISKY", "RNDC", "ROCE", "ROCK",
|
||||||
"RUG", "RULE", "RUST", "RVOL", "SAGA", "SALES", "SAME", "SAVE", "SAYS", "SBF",
|
"ROE", "ROFL", "ROI", "ROIC", "ROTH", "RPO", "RRSP", "RSD", "RSI", "RT",
|
||||||
"SBLOC", "SC", "SCALP", "SCAM", "SCHB", "SCIF", "SEC", "SEE", "SEK", "SELL",
|
"RTD", "RUB", "RUG", "RULE", "RUST", "RVOL", "SAGA", "SALES", "SAME", "SAVE",
|
||||||
"SELLL", "SEP", "SESG", "SET", "SFOR", "SGD", "SHALL", "SHARE", "SHEIN", "SHELL",
|
"SAYS", "SBF", "SBLOC", "SC", "SCALP", "SCAM", "SCHB", "SCIF", "SEC", "SEE",
|
||||||
"SHIT", "SHORT", "SHOW", "SHS", "SHTF", "SI", "SICK", "SIGN", "SL", "SLIM",
|
"SEK", "SELL", "SELLL", "SEP", "SESG", "SET", "SFOR", "SGD", "SHALL", "SHARE",
|
||||||
"SLOW", "SMA", "SMALL", "SMFH", "SNZ", "SO", "SOLD", "SOLIS", "SOME", "SOON",
|
"SHEIN", "SHELL", "SHIT", "SHORT", "SHOW", "SHS", "SHTF", "SI", "SICK", "SIGN",
|
||||||
"SOOO", "SOUTH", "SP", "SPAC", "SPDR", "SPEND", "SPLG", "SPX", "SPY", "SQUAD",
|
"SL", "SLIM", "SLOW", "SMA", "SMALL", "SMFH", "SNZ", "SO", "SOLD", "SOLIS",
|
||||||
"SS", "SSA", "SSDI", "START", "STAY", "STEEL", "STFU", "STILL", "STO", "STOCK",
|
"SOME", "SOON", "SOOO", "SOUTH", "SP", "SPAC", "SPDR", "SPEND", "SPLG", "SPX",
|
||||||
"STOOQ", "STOP", "STOR", "STQQQ", "STUCK", "STUDY", "SUS", "SUSHI", "SUV", "SWIFT",
|
"SPY", "SQUAD", "SS", "SSA", "SSDI", "START", "STAY", "STEEL", "STFU", "STILL",
|
||||||
"SWING", "TA", "TAG", "TAKE", "TAM", "TBTH", "TEAMS", "TED", "TEMU", "TERM",
|
"STO", "STOCK", "STOOQ", "STOP", "STOR", "STQQQ", "STUCK", "STUDY", "SUS", "SUSHI",
|
||||||
"TESLA", "TEXT", "TF", "TFNA", "TFSA", "THAN", "THANK", "THAT", "THATS", "THE",
|
"SUV", "SWIFT", "SWING", "TA", "TAG", "TAKE", "TAM", "TBTH", "TEAMS", "TED",
|
||||||
"THEIR", "THEM", "THEN", "THERE", "THESE", "THEY", "THING", "THINK", "THIS", "TI",
|
"TEMU", "TERM", "TESLA", "TEXT", "TF", "TFNA", "TFSA", "THAN", "THANK", "THAT",
|
||||||
"TIA", "TIKR", "TIME", "TIMES", "TINA", "TITS", "TJR", "TL", "TL;DR", "TLDR",
|
"THATS", "THE", "THEIR", "THEM", "THEN", "THERE", "THESE", "THEY", "THING", "THINK",
|
||||||
"TNT", "TO", "TODAY", "TOLD", "TONS", "TOO", "TOS", "TOT", "TOTAL", "TP",
|
"THIS", "THROW", "TI", "TIA", "TIKR", "TIME", "TIMES", "TINA", "TITS", "TJR",
|
||||||
"TPU", "TRADE", "TREND", "TRUE", "TRUMP", "TRUST", "TRY", "TSA", "TSMC", "TSP",
|
"TL", "TL;DR", "TLDR", "TNT", "TO", "TODAY", "TOLD", "TONS", "TOO", "TOS",
|
||||||
"TSX", "TSXV", "TTIP", "TTM", "TTYL", "TURNS", "TWO", "UAW", "UCITS", "UGH",
|
"TOT", "TOTAL", "TP", "TPU", "TRADE", "TREND", "TRUE", "TRUMP", "TRUST", "TRY",
|
||||||
"UI", "UK", "UNDER", "UNITS", "UNO", "UNTIL", "UP", "US", "USA", "USD",
|
"TSA", "TSMC", "TSP", "TSX", "TSXV", "TTIP", "TTM", "TTYL", "TURNS", "TWO",
|
||||||
"USMCA", "USSA", "USSR", "UTC", "VALID", "VALUE", "VAMOS", "VAT", "VEO", "VERY",
|
"UAW", "UCITS", "UGH", "UI", "UK", "UNDER", "UNITS", "UNO", "UNTIL", "UP",
|
||||||
"VFMXX", "VFV", "VI", "VISA", "VIX", "VLI", "VOO", "VP", "VPAY", "VR",
|
"US", "USA", "USD", "USMCA", "USSA", "USSR", "UTC", "VALID", "VALUE", "VAMOS",
|
||||||
"VRVP", "VSUS", "VTI", "VUAG", "VW", "VWAP", "VWCE", "VXN", "VXUX", "WAGER",
|
"VAT", "VEIEN", "VEO", "VERY", "VFMXX", "VFV", "VI", "VISA", "VIX", "VLI",
|
||||||
"WAGMI", "WAIT", "WALL", "WANT", "WAS", "WATCH", "WAY", "WBTC", "WE", "WEB",
|
"VOO", "VP", "VPAY", "VR", "VRVP", "VSUS", "VTI", "VUAG", "VW", "VWAP",
|
||||||
"WEB3", "WEEK", "WENT", "WERO", "WEST", "WHALE", "WHAT", "WHEN", "WHERE", "WHICH",
|
"VWCE", "VXN", "VXUX", "WAGER", "WAGMI", "WAIT", "WALL", "WANT", "WAS", "WATCH",
|
||||||
"WHILE", "WHO", "WHOS", "WHY", "WIDE", "WILL", "WIRE", "WIRED", "WITH", "WL",
|
"WAY", "WBTC", "WE", "WEB", "WEB3", "WEEK", "WENT", "WERO", "WEST", "WHALE",
|
||||||
"WON", "WOOPS", "WORDS", "WORTH", "WOULD", "WP", "WRONG", "WSB", "WSJ", "WTF",
|
"WHAT", "WHEN", "WHERE", "WHICH", "WHILE", "WHO", "WHOS", "WHY", "WIDE", "WILL",
|
||||||
"WV", "WWII", "WWIII", "X", "XAU", "XCUSE", "XD", "XEQT", "XI", "XIV",
|
"WIRE", "WIRED", "WITH", "WL", "WON", "WOOPS", "WORDS", "WORTH", "WOULD", "WP",
|
||||||
"XMR", "XO", "XRP", "XX", "YEAH", "YEET", "YES", "YET", "YIELD", "YM",
|
"WRONG", "WSB", "WSJ", "WTF", "WV", "WWII", "WWIII", "X", "XAU", "XCUSE",
|
||||||
"YMMV", "YOIR", "YOLO", "YOU", "YOUR", "YOY", "YT", "YTD", "YUGE", "YUPPP",
|
"XD", "XEQT", "XI", "XIV", "XMR", "XO", "XRP", "XX", "YEAH", "YEET",
|
||||||
"ZAR", "ZEN", "ZERO", "ZEV"
|
"YES", "YET", "YIELD", "YM", "YMMV", "YOIR", "YOLO", "YOU", "YOUR", "YOY",
|
||||||
|
"YT", "YTD", "YUGE", "YUP", "YUPPP", "ZAR", "ZEN", "ZERO", "ZEV"
|
||||||
}
|
}
|
||||||
|
|
||||||
def format_and_print_list(word_set, words_per_line=10):
|
def format_and_print_list(word_set, words_per_line=10):
|
||||||
|
@@ -110,6 +110,7 @@ def _process_submission(submission, subreddit_id, conn, comment_limit):
 "post",
 int(submission.created_utc),
 post_sentiment,
+comment_id=None,
 )

 # 3. --- Process Comments (Single, Efficient Loop) ---
@@ -132,6 +133,7 @@ def _process_submission(submission, subreddit_id, conn, comment_limit):
 "comment",
 int(comment.created_utc),
 comment_sentiment,
+comment_id=comment.id,
 )
 else:
 # If no title tickers, we must scan the comment for direct mentions.
@@ -156,6 +158,7 @@ def _process_submission(submission, subreddit_id, conn, comment_limit):
 "comment",
 int(comment.created_utc),
 comment_sentiment,
+comment_id=comment.id,
 )

 # 4. --- Save Deep Dive Analysis ---
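Context for the three hunks above: threading `comment_id` through to storage gives each comment-level mention a stable key, so re-scanning a post can upsert instead of double-counting. A hypothetical sketch of that idea — the table layout and the `add_mention` signature here are assumptions for illustration; only the new `comment_id` argument appears in the diff:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    """
    CREATE TABLE mentions (
        ticker TEXT NOT NULL,
        mention_type TEXT NOT NULL,   -- 'post' or 'comment'
        created_utc INTEGER NOT NULL,
        sentiment REAL,
        comment_id TEXT,              -- NULL for post-level mentions
        UNIQUE (ticker, mention_type, comment_id)
    )
    """
)

def add_mention(conn, ticker, mention_type, created_utc, sentiment, comment_id=None):
    # Re-scanning the same comment replaces the old row instead of duplicating it.
    conn.execute(
        "INSERT OR REPLACE INTO mentions VALUES (?, ?, ?, ?, ?)",
        (ticker, mention_type, created_utc, sentiment, comment_id),
    )

add_mention(conn, "TSLA", "comment", 1700000000, 0.6, comment_id="abc123")
add_mention(conn, "TSLA", "comment", 1700000000, 0.7, comment_id="abc123")
print(conn.execute("SELECT COUNT(*) FROM mentions").fetchone()[0])  # prints 1
```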
@@ -7,116 +7,118 @@ import re
 COMMON_WORDS_BLACKLIST = {
 "401K", "403B", "457B", "AAVE", "ABC", "ABOUT", "ABOVE", "ACAT", "ADAM", "ADHD",
 "ADR", "ADS", "ADX", "AEDT", "AEST", "AF", "AFAIK", "AFTER", "AGENT", "AH",
-"AI", "AINT", "AK", "ALD", "ALGOS", "ALIVE", "ALL", "ALPHA", "ALSO", "AM",
+"AI", "AINT", "AK", "AKSJE", "ALD", "ALGOS", "ALIVE", "ALL", "ALPHA", "ALSO",
-"AMA", "AMEX", "AMK", "AMY", "AND", "ANSS", "ANY", "APES", "APL", "APPL",
+"AM", "AMA", "AMEX", "AMK", "AMY", "AND", "ANSS", "ANY", "APES", "APL",
-"APPLE", "APR", "APUS", "APY", "AR", "ARBK", "ARE", "AREA", "ARH", "ARK",
+"APPL", "APPLE", "APR", "APUS", "APY", "AR", "ARBK", "ARE", "AREA", "ARH",
-"AROUND", "ART", "AS", "ASAP", "ASEAN", "ASK", "ASS", "ASSET", "AST", "AT",
+"ARK", "AROUND", "ART", "AS", "ASAP", "ASEAN", "ASK", "ASS", "ASSET", "AST",
-"ATH", "ATL", "ATM", "AUD", "AUG", "AUM", "AV", "AVG", "AWS", "BABY",
+"AT", "ATH", "ATL", "ATM", "AUD", "AUG", "AUM", "AV", "AVG", "AWS",
-"BAG", "BAGS", "BALLS", "BAN", "BANG", "BASIC", "BBB", "BBBY", "BE", "BEAR",
+"BABY", "BAD", "BAG", "BAGS", "BALLS", "BAN", "BANG", "BASIC", "BBB", "BBBY",
-"BEARS", "BECN", "BEER", "BELL", "BELOW", "BETA", "BETS", "BF", "BID", "BIG",
+"BE", "BEAR", "BEARS", "BECN", "BEER", "BELL", "BELOW", "BETA", "BETS", "BF",
-"BIS", "BITCH", "BKEY", "BLEND", "BMW", "BNP", "BNPL", "BOE", "BOJ", "BOLL",
+"BID", "BIG", "BIS", "BITCH", "BKEY", "BLEND", "BLOW", "BMW", "BNP", "BNPL",
-"BOMB", "BOND", "BONED", "BORN", "BOTH", "BOTS", "BOY", "BOYS", "BRB", "BRICS",
+"BOARD", "BOE", "BOJ", "BOLL", "BOMB", "BOND", "BONED", "BORN", "BOTH", "BOTS",
-"BRK", "BRKA", "BRKB", "BRL", "BROKE", "BRRRR", "BS", "BSE", "BST", "BSU",
+"BOY", "BOYS", "BRB", "BRICS", "BRK", "BRKA", "BRKB", "BRL", "BROKE", "BRRRR",
-"BT", "BTC", "BTS", "BTW", "BUDDY", "BULL", "BULLS", "BUST", "BUT", "BUY",
+"BS", "BSE", "BST", "BSU", "BT", "BTC", "BTS", "BTW", "BUDDY", "BULL",
-"BUZZ", "CAD", "CAFE", "CAGR", "CALL", "CALLS", "CAN", "CAP", "CARB", "CARES",
+"BULLS", "BUST", "BUT", "BUY", "BUZZ", "CAD", "CAFE", "CAGR", "CALL", "CALLS",
-"CASE", "CATL", "CBD", "CBGM", "CBS", "CCI", "CCP", "CD", "CDN", "CEO",
+"CAN", "CAP", "CARB", "CARES", "CASE", "CATL", "CBD", "CBGM", "CBS", "CCI",
-"CEST", "CET", "CEX", "CFD", "CFO", "CFPB", "CHART", "CHASE", "CHATS", "CHECK",
+"CCP", "CD", "CDN", "CEO", "CEST", "CET", "CEX", "CFD", "CFO", "CFPB",
-"CHF", "CHICK", "CHIP", "CHIPS", "CIA", "CIC", "CLAIM", "CLEAN", "CLICK", "CLOSE",
+"CHART", "CHASE", "CHATS", "CHECK", "CHF", "CHICK", "CHIP", "CHIPS", "CIA", "CIC",
-"CMON", "CN", "CNBC", "CNN", "CNY", "COBRA", "COCK", "COGS", "COIL", "COKE",
+"CLAIM", "CLEAN", "CLICK", "CLOSE", "CMON", "CN", "CNBC", "CNN", "CNY", "COBRA",
-"COME", "COST", "COULD", "COVID", "CPAP", "CPI", "CRA", "CRE", "CRO", "CRV",
+"COCK", "COGS", "COIL", "COKE", "COME", "COST", "COULD", "COVID", "CPAP", "CPI",
-"CSE", "CSP", "CSS", "CST", "CTB", "CTEP", "CTO", "CUCKS", "CULT", "CUM",
+"CRA", "CRE", "CRO", "CRV", "CSE", "CSP", "CSS", "CST", "CTB", "CTEP",
-"CUSMA", "CUTS", "CUV", "CYCLE", "CZK", "DA", "DAILY", "DAO", "DATE", "DAX",
+"CTO", "CUCKS", "CULT", "CUM", "CUSMA", "CUTS", "CUV", "CYCLE", "CZK", "DA",
-"DAY", "DAYS", "DCA", "DCF", "DD", "DEAL", "DEBT", "DEEZ", "DEMO", "DET",
+"DAILY", "DAO", "DART", "DATA", "DATE", "DAX", "DAY", "DAYS", "DCA", "DCF",
-"DEX", "DGAF", "DIA", "DID", "DIDNT", "DIP", "DITM", "DIV", "DIY", "DJI",
+"DD", "DEAL", "DEBT", "DEEZ", "DEMO", "DET", "DEX", "DGAF", "DIA", "DID",
-"DJIA", "DJTJ", "DKK", "DL", "DM", "DMV", "DNI", "DNUTZ", "DO", "DOD",
+"DIDNT", "DIP", "DITM", "DIV", "DIY", "DJI", "DJIA", "DJTJ", "DKK", "DL",
-"DOE", "DOES", "DOGE", "DOING", "DOJ", "DOM", "DONNY", "DONT", "DONUT", "DOOR",
+"DM", "DMV", "DNI", "DNUTZ", "DO", "DOD", "DOE", "DOES", "DOGE", "DOING",
-"DOWN", "DOZEN", "DPI", "DR", "DUDE", "DUMP", "DUNT", "DUT", "DUTY", "DXY",
+"DOJ", "DOM", "DONNY", "DONT", "DONUT", "DOOR", "DOWN", "DOZEN", "DPI", "DR",
-"DXYXBT", "DYI", "DYNK", "DYODD", "DYOR", "EACH", "EARLY", "EARN", "EAST", "EASY",
+"DUDE", "DUMP", "DUNT", "DUT", "DUTY", "DXY", "DXYXBT", "DYI", "DYNK", "DYODD",
-"ECB", "EDGAR", "EDIT", "EDT", "EJ", "EMA", "EMJ", "EMT", "END", "ENRON",
+"DYOR", "EACH", "EARLY", "EARN", "EAST", "EASY", "EBIT", "ECB", "EDGAR", "EDIT",
-"ENSI", "ENV", "EO", "EOD", "EOM", "EOW", "EOY", "EPA", "EPK", "EPS",
+"EDT", "EJ", "EMA", "EMJ", "EMT", "END", "ENRON", "ENSI", "ENV", "EO",
-"ER", "ESG", "ESPP", "EST", "ETA", "ETF", "ETFS", "ETH", "ETL", "EU",
+"EOD", "EOM", "EOW", "EOY", "EPA", "EPK", "EPS", "ER", "ESG", "ESPP",
-"EUR", "EV", "EVEN", "EVERY", "EVTOL", "EXTRA", "EYES", "EZ", "FAANG", "FAFO",
+"EST", "ETA", "ETF", "ETFS", "ETH", "ETHT", "ETL", "EU", "EUR", "EV",
-"FAQ", "FAR", "FAST", "FBI", "FCC", "FCFF", "FD", "FDA", "FEE", "FFH",
+"EVEN", "EVERY", "EVTOL", "EXTRA", "EYES", "EZ", "FAANG", "FAFO", "FAQ", "FAR",
-"FFS", "FGMA", "FIG", "FIGMA", "FIHTX", "FILES", "FINAL", "FIND", "FING", "FINRA",
+"FAST", "FBI", "FCC", "FCFF", "FD", "FDA", "FED", "FEE", "FFH", "FFS",
-"FINT", "FINTX", "FINTY", "FIRE", "FIRST", "FKIN", "FLRAA", "FLT", "FLY", "FML",
+"FGMA", "FIG", "FIGMA", "FIHTX", "FILES", "FINAL", "FIND", "FING", "FINRA", "FINT",
+"FINTX", "FINTY", "FIRE", "FIRST", "FKIN", "FLOAT", "FLRAA", "FLT", "FLY", "FML",
 "FOLO", "FOMC", "FOMO", "FOR", "FOREX", "FRAUD", "FREAK", "FRED", "FRG", "FROM",
 "FRP", "FRS", "FSBO", "FSD", "FSE", "FSELK", "FSPSX", "FTD", "FTSE", "FUCK",
 "FUCKS", "FUD", "FULL", "FUND", "FUNNY", "FVG", "FWIW", "FX", "FXAIX", "FXIAX",
-"FXROX", "FY", "FYI", "FZROX", "GAAP", "GAIN", "GAVE", "GBP", "GC", "GDP",
+"FXROX", "FY", "FYI", "FZROX", "GAAP", "GAIN", "GAV", "GAVE", "GBP", "GC",
-"GET", "GFC", "GG", "GGTM", "GIVES", "GJ", "GL", "GLHF", "GMAT", "GMI",
+"GDP", "GET", "GFC", "GG", "GGTM", "GIVES", "GJ", "GL", "GLHF", "GMAT",
-"GMT", "GO", "GOAL", "GOAT", "GOD", "GOING", "GOLD", "GONE", "GONNA", "GOODS",
+"GMI", "GMT", "GO", "GOAL", "GOAT", "GOD", "GOING", "GOLD", "GONE", "GONNA",
-"GOPRO", "GPT", "GPU", "GRAB", "GREAT", "GREEN", "GSOV", "GST", "GTA", "GTC",
+"GOODS", "GOPRO", "GPT", "GPU", "GRAB", "GREAT", "GREEN", "GSOV", "GST", "GTA",
-"GTFO", "GTG", "GUH", "GUNS", "GUY", "GUYS", "HAD", "HAHA", "HALF", "HAM",
+"GTC", "GTFO", "GTG", "GUH", "GUNS", "GUY", "GUYS", "HAD", "HAHA", "HALF",
-"HANDS", "HAS", "HATE", "HAVE", "HBAR", "HCOL", "HEAR", "HEDGE", "HEGE", "HELD",
+"HAM", "HANDS", "HAS", "HATE", "HAVE", "HBAR", "HCOL", "HEAR", "HEDGE", "HEGE",
-"HELL", "HELP", "HERE", "HEY", "HFCS", "HFT", "HGTV", "HIGH", "HIGHS", "HINT",
+"HELD", "HELE", "HELL", "HELP", "HERE", "HEY", "HFCS", "HFT", "HGTV", "HIGH",
-"HIS", "HITID", "HK", "HKD", "HKEX", "HODL", "HODOR", "HOF", "HOLD", "HOLY",
+"HIGHS", "HINT", "HIS", "HITID", "HK", "HKD", "HKEX", "HODL", "HODOR", "HOF",
-"HOME", "HOT", "HOUR", "HOURS", "HOW", "HS", "HSA", "HSI", "HT", "HTCI",
+"HOLD", "HOLY", "HOME", "HOT", "HOUR", "HOURS", "HOW", "HS", "HSA", "HSI",
-"HTF", "HTML", "HUF", "HUGE", "HV", "HYPE", "IANAL", "IATF", "IB", "IBS",
+"HT", "HTCI", "HTF", "HTML", "HUF", "HUGE", "HV", "HYPE", "IANAL", "IATF",
-"ICSID", "ICT", "ID", "IDF", "IDK", "IF", "II", "IIRC", "IKKE", "IKZ",
+"IB", "IBS", "ICSID", "ICT", "ID", "IDF", "IDK", "IF", "II", "IIRC",
-"IM", "IMHO", "IMI", "IMO", "IN", "INC", "INR", "INTEL", "INTO", "IP",
+"IKKE", "IKZ", "IM", "IMHO", "IMI", "IMO", "IN", "INC", "INR", "INTEL",
-"IPO", "IQVIA", "IRA", "IRAS", "IRC", "IRISH", "IRMAA", "IRS", "IS", "ISA",
+"INTO", "IP", "IPO", "IQVIA", "IRA", "IRAS", "IRC", "IRISH", "IRL", "IRMAA",
-"ISIN", "ISM", "ISN", "IST", "IT", "ITC", "ITM", "ITS", "ITWN", "IUIT",
+"IRS", "IS", "ISA", "ISIN", "ISM", "ISN", "IST", "IT", "ITC", "ITM",
-"IV", "IVV", "IWM", "IXL", "IXLH", "IYKYK", "JAVA", "JD", "JDG", "JDM",
+"ITS", "ITWN", "IUIT", "IV", "IVV", "IWM", "IXL", "IXLH", "IYKYK", "JAVA",
-"JE", "JFC", "JK", "JLR", "JMO", "JOBS", "JOIN", "JOKE", "JP", "JPOW",
+"JD", "JDG", "JDM", "JE", "JFC", "JK", "JLR", "JMO", "JOBS", "JOIN",
-"JPY", "JS", "JST", "JUN", "JUST", "KARMA", "KEEP", "KILL", "KING", "KK",
+"JOKE", "JP", "JPOW", "JPY", "JS", "JST", "JULY", "JUN", "JUST", "KARMA",
-"KLA", "KLP", "KNEW", "KNOW", "KO", "KOHLS", "KPMG", "KRW", "LA", "LANGT",
+"KEEP", "KILL", "KING", "KK", "KLA", "KLP", "KNEW", "KNOW", "KO", "KOHLS",
-"LARGE", "LAST", "LATE", "LATER", "LBO", "LBTC", "LCS", "LDL", "LEADS", "LEAP",
+"KPMG", "KRW", "LA", "LANGT", "LARGE", "LAST", "LATE", "LATER", "LBO", "LBTC",
-"LEAPS", "LEARN", "LEI", "LET", "LETF", "LETS", "LFA", "LFG", "LFP", "LG",
+"LCS", "LDL", "LEADS", "LEAP", "LEAPS", "LEARN", "LEGS", "LEI", "LET", "LETF",
-"LGEN", "LIFE", "LIG", "LIGMA", "LIKE", "LIMIT", "LIST", "LLC", "LLM", "LM",
+"LETS", "LFA", "LFG", "LFP", "LG", "LGEN", "LID", "LIFE", "LIG", "LIGMA",
-"LMAO", "LMAOO", "LMM", "LMN", "LOANS", "LOKO", "LOL", "LOLOL", "LONG", "LONGS",
+"LIKE", "LIMIT", "LIST", "LLC", "LLM", "LM", "LMAO", "LMAOO", "LMM", "LMN",
-"LOOK", "LOSE", "LOSS", "LOST", "LOVE", "LOVES", "LOW", "LOWER", "LOWS", "LP",
+"LOANS", "LOKO", "LOL", "LOLOL", "LONG", "LONGS", "LOOK", "LOSE", "LOSS", "LOST",
-"LSS", "LTCG", "LUCID", "LUPD", "LYC", "LYING", "M&A", "MA", "MACD", "MAIL",
+"LOVE", "LOVES", "LOW", "LOWER", "LOWS", "LP", "LSS", "LTCG", "LUCID", "LUPD",
-"MAKE", "MAKES", "MANGE", "MANY", "MASON", "MAX", "MAY", "MAYBE", "MBA", "MC",
+"LYC", "LYING", "M&A", "MA", "MACD", "MAIL", "MAKE", "MAKES", "MANGE", "MANY",
-"MCAP", "MCNA", "MCP", "ME", "MEAN", "MEME", "MERGE", "MERK", "MES", "MEXC",
+"MASON", "MAX", "MAY", "MAYBE", "MBA", "MC", "MCAP", "MCNA", "MCP", "ME",
-"MF", "MFER", "MID", "MIGHT", "MIN", "MIND", "MINS", "ML", "MLB", "MLS",
+"MEAN", "MEME", "MER", "MERGE", "MERK", "MES", "MEXC", "MF", "MFER", "MID",
-"MM", "MMF", "MNQ", "MOASS", "MODEL", "MODTX", "MOM", "MONEY", "MONTH", "MONY",
+"MIGHT", "MIN", "MIND", "MINS", "ML", "MLB", "MLS", "MM", "MMF", "MNQ",
-"MOON", "MORE", "MOST", "MOU", "MSK", "MTVGA", "MUCH", "MUSIC", "MUST", "MVA",
+"MOASS", "MODEL", "MODTX", "MOM", "MONEY", "MONGO", "MONTH", "MONY", "MOON", "MORE",
-"MXN", "MY", "MYMD", "NASA", "NASDA", "NATO", "NAV", "NBA", "NBC", "NCAN",
+"MOST", "MOU", "MSK", "MTVGA", "MUCH", "MUSIC", "MUST", "MVA", "MXN", "MY",
-"NCR", "NEAR", "NEAT", "NEED", "NEVER", "NEW", "NEWS", "NEXT", "NFA", "NFC",
+"MYMD", "NASA", "NASDA", "NATO", "NAV", "NBA", "NBC", "NCAN", "NCR", "NEAR",
-"NFL", "NFT", "NGAD", "NGMI", "NIGHT", "NIQ", "NK", "NO", "NOK", "NON",
+"NEAT", "NEED", "NEVER", "NEW", "NEWS", "NEXT", "NFA", "NFC", "NFL", "NFT",
-"NONE", "NOOO", "NOPE", "NORTH", "NOT", "NOVA", "NOW", "NQ", "NRI", "NSA",
+"NGAD", "NGMI", "NIGHT", "NIQ", "NK", "NO", "NOK", "NON", "NONE", "NOOO",
-"NSCLC", "NSLC", "NTG", "NTVS", "NULL", "NUT", "NUTS", "NUTZ", "NVM", "NW",
+"NOPE", "NORTH", "NOT", "NOVA", "NOW", "NQ", "NRI", "NSA", "NSCLC", "NSLC",
-"NY", "NYSE", "NZ", "NZD", "OBBB", "OBI", "OBS", "OBV", "OCD", "OCF",
+"NTG", "NTVS", "NULL", "NUT", "NUTS", "NUTZ", "NVIDIA", "NVM", "NW", "NY",
-"OCO", "ODAT", "ODTE", "OEM", "OF", "OFA", "OFF", "OG", "OH", "OK",
+"NYSE", "NZ", "NZD", "OBBB", "OBI", "OBS", "OBV", "OCD", "OCF", "OCO",
-"OKAY", "OL", "OLD", "OMFG", "OMG", "ON", "ONDAS", "ONE", "ONLY", "OP",
+"ODAT", "ODTE", "OEM", "OF", "OFA", "OFF", "OG", "OH", "OK", "OKAY",
-"OPEC", "OPENQ", "OPEX", "OPRN", "OR", "ORB", "ORDER", "ORTEX", "OS", "OSCE",
+"OL", "OLD", "OMFG", "OMG", "ON", "ONDAS", "ONE", "ONLY", "OP", "OPEC",
-"OT", "OTC", "OTM", "OTOH", "OUCH", "OUGHT", "OUR", "OUT", "OVER", "OWN",
+"OPENQ", "OPEX", "OPRN", "OR", "ORB", "ORDER", "ORTEX", "OS", "OSCE", "OSE",
-"OZZY", "PA", "PANIC", "PC", "PDT", "PE", "PEAK", "PEG", "PETA", "PEW",
+"OSEBX", "OT", "OTC", "OTM", "OTOH", "OUCH", "OUGHT", "OUR", "OUT", "OVER",
-"PFC", "PGHL", "PIMCO", "PITA", "PLAN", "PLAYS", "PLC", "PLN", "PM", "PMCC",
+"OWN", "OZZY", "PA", "PAID", "PANIC", "PC", "PDT", "PE", "PEAK", "PEG",
-"PMI", "PNL", "POC", "POMO", "POP", "POS", "POSCO", "POTUS", "POV", "POW",
+"PETA", "PEW", "PFC", "PGHL", "PIMCO", "PITA", "PLAN", "PLAYS", "PLC", "PLN",
-"PPI", "PR", "PRICE", "PRIME", "PROFIT", "PROXY", "PS", "PSA", "PST", "PT",
+"PM", "PMCC", "PMI", "PNL", "POC", "POMO", "POP", "POS", "POSCO", "POTUS",
-"PTD", "PUSSY", "PUT", "PUTS", "PWC", "Q1", "Q2", "Q3", "Q4", "QE",
+"POV", "POW", "PPI", "PR", "PRICE", "PRIME", "PROFIT", "PROXY", "PS", "PSA",
-"QED", "QIMC", "QQQ", "QR", "RAM", "RATM", "RBA", "RBNZ", "RE", "REACH",
+"PST", "PT", "PTD", "PUSSY", "PUT", "PUTS", "PWC", "Q1", "Q2", "Q3",
-"READY", "REAL", "RED", "REIT", "REITS", "REKT", "REPE", "RFK", "RH", "RICO",
+"Q4", "QE", "QED", "QIMC", "QQQ", "QR", "RAM", "RATM", "RBA", "RBNZ",
-"RIDE", "RIGHT", "RIP", "RISK", "RISKY", "RNDC", "ROCE", "ROCK", "ROE", "ROFL",
+"RE", "REACH", "READY", "REAL", "RED", "REIT", "REITS", "REKT", "REPE", "RFK",
-"ROI", "ROIC", "ROTH", "RPO", "RRSP", "RSD", "RSI", "RT", "RTD", "RUB",
+"RH", "RICO", "RIDE", "RIGHT", "RIP", "RISK", "RISKY", "RNDC", "ROCE", "ROCK",
-"RUG", "RULE", "RUST", "RVOL", "SAGA", "SALES", "SAME", "SAVE", "SAYS", "SBF",
+"ROE", "ROFL", "ROI", "ROIC", "ROTH", "RPO", "RRSP", "RSD", "RSI", "RT",
-"SBLOC", "SC", "SCALP", "SCAM", "SCHB", "SCIF", "SEC", "SEE", "SEK", "SELL",
+"RTD", "RUB", "RUG", "RULE", "RUST", "RVOL", "SAGA", "SALES", "SAME", "SAVE",
-"SELLL", "SEP", "SESG", "SET", "SFOR", "SGD", "SHALL", "SHARE", "SHEIN", "SHELL",
+"SAYS", "SBF", "SBLOC", "SC", "SCALP", "SCAM", "SCHB", "SCIF", "SEC", "SEE",
-"SHIT", "SHORT", "SHOW", "SHS", "SHTF", "SI", "SICK", "SIGN", "SL", "SLIM",
+"SEK", "SELL", "SELLL", "SEP", "SESG", "SET", "SFOR", "SGD", "SHALL", "SHARE",
-"SLOW", "SMA", "SMALL", "SMFH", "SNZ", "SO", "SOLD", "SOLIS", "SOME", "SOON",
+"SHEIN", "SHELL", "SHIT", "SHORT", "SHOW", "SHS", "SHTF", "SI", "SICK", "SIGN",
-"SOOO", "SOUTH", "SP", "SPAC", "SPDR", "SPEND", "SPLG", "SPX", "SPY", "SQUAD",
+"SL", "SLIM", "SLOW", "SMA", "SMALL", "SMFH", "SNZ", "SO", "SOLD", "SOLIS",
-"SS", "SSA", "SSDI", "START", "STAY", "STEEL", "STFU", "STILL", "STO", "STOCK",
+"SOME", "SOON", "SOOO", "SOUTH", "SP", "SPAC", "SPDR", "SPEND", "SPLG", "SPX",
-"STOOQ", "STOP", "STOR", "STQQQ", "STUCK", "STUDY", "SUS", "SUSHI", "SUV", "SWIFT",
+"SPY", "SQUAD", "SS", "SSA", "SSDI", "START", "STAY", "STEEL", "STFU", "STILL",
-"SWING", "TA", "TAG", "TAKE", "TAM", "TBTH", "TEAMS", "TED", "TEMU", "TERM",
+"STO", "STOCK", "STOOQ", "STOP", "STOR", "STQQQ", "STUCK", "STUDY", "SUS", "SUSHI",
-"TESLA", "TEXT", "TF", "TFNA", "TFSA", "THAN", "THANK", "THAT", "THATS", "THE",
+"SUV", "SWIFT", "SWING", "TA", "TAG", "TAKE", "TAM", "TBTH", "TEAMS", "TED",
-"THEIR", "THEM", "THEN", "THERE", "THESE", "THEY", "THING", "THINK", "THIS", "TI",
+"TEMU", "TERM", "TESLA", "TEXT", "TF", "TFNA", "TFSA", "THAN", "THANK", "THAT",
-"TIA", "TIKR", "TIME", "TIMES", "TINA", "TITS", "TJR", "TL", "TL;DR", "TLDR",
+"THATS", "THE", "THEIR", "THEM", "THEN", "THERE", "THESE", "THEY", "THING", "THINK",
-"TNT", "TO", "TODAY", "TOLD", "TONS", "TOO", "TOS", "TOT", "TOTAL", "TP",
+"THIS", "THROW", "TI", "TIA", "TIKR", "TIME", "TIMES", "TINA", "TITS", "TJR",
-"TPU", "TRADE", "TREND", "TRUE", "TRUMP", "TRUST", "TRY", "TSA", "TSMC", "TSP",
+"TL", "TL;DR", "TLDR", "TNT", "TO", "TODAY", "TOLD", "TONS", "TOO", "TOS",
-"TSX", "TSXV", "TTIP", "TTM", "TTYL", "TURNS", "TWO", "UAW", "UCITS", "UGH",
+"TOT", "TOTAL", "TP", "TPU", "TRADE", "TREND", "TRUE", "TRUMP", "TRUST", "TRY",
-"UI", "UK", "UNDER", "UNITS", "UNO", "UNTIL", "UP", "US", "USA", "USD",
+"TSA", "TSMC", "TSP", "TSX", "TSXV", "TTIP", "TTM", "TTYL", "TURNS", "TWO",
-"USMCA", "USSA", "USSR", "UTC", "VALID", "VALUE", "VAMOS", "VAT", "VEO", "VERY",
+"UAW", "UCITS", "UGH", "UI", "UK", "UNDER", "UNITS", "UNO", "UNTIL", "UP",
-"VFMXX", "VFV", "VI", "VISA", "VIX", "VLI", "VOO", "VP", "VPAY", "VR",
+"US", "USA", "USD", "USMCA", "USSA", "USSR", "UTC", "VALID", "VALUE", "VAMOS",
-"VRVP", "VSUS", "VTI", "VUAG", "VW", "VWAP", "VWCE", "VXN", "VXUX", "WAGER",
+"VAT", "VEIEN", "VEO", "VERY", "VFMXX", "VFV", "VI", "VISA", "VIX", "VLI",
-"WAGMI", "WAIT", "WALL", "WANT", "WAS", "WATCH", "WAY", "WBTC", "WE", "WEB",
+"VOO", "VP", "VPAY", "VR", "VRVP", "VSUS", "VTI", "VUAG", "VW", "VWAP",
-"WEB3", "WEEK", "WENT", "WERO", "WEST", "WHALE", "WHAT", "WHEN", "WHERE", "WHICH",
+"VWCE", "VXN", "VXUX", "WAGER", "WAGMI", "WAIT", "WALL", "WANT", "WAS", "WATCH",
-"WHILE", "WHO", "WHOS", "WHY", "WIDE", "WILL", "WIRE", "WIRED", "WITH", "WL",
+"WAY", "WBTC", "WE", "WEB", "WEB3", "WEEK", "WENT", "WERO", "WEST", "WHALE",
-"WON", "WOOPS", "WORDS", "WORTH", "WOULD", "WP", "WRONG", "WSB", "WSJ", "WTF",
+"WHAT", "WHEN", "WHERE", "WHICH", "WHILE", "WHO", "WHOS", "WHY", "WIDE", "WILL",
-"WV", "WWII", "WWIII", "X", "XAU", "XCUSE", "XD", "XEQT", "XI", "XIV",
+"WIRE", "WIRED", "WITH", "WL", "WON", "WOOPS", "WORDS", "WORTH", "WOULD", "WP",
-"XMR", "XO", "XRP", "XX", "YEAH", "YEET", "YES", "YET", "YIELD", "YM",
+"WRONG", "WSB", "WSJ", "WTF", "WV", "WWII", "WWIII", "X", "XAU", "XCUSE",
-"YMMV", "YOIR", "YOLO", "YOU", "YOUR", "YOY", "YT", "YTD", "YUGE", "YUPPP",
+"XD", "XEQT", "XI", "XIV", "XMR", "XO", "XRP", "XX", "YEAH", "YEET",
-"ZAR", "ZEN", "ZERO", "ZEV"
+"YES", "YET", "YIELD", "YM", "YMMV", "YOIR", "YOLO", "YOU", "YOUR", "YOY",
+"YT", "YTD", "YUGE", "YUP", "YUPPP", "ZAR", "ZEN", "ZERO", "ZEV"
 }

 def extract_golden_tickers(text):
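For readers skimming the hunk above: the blacklist feeds the two-tier extraction that `extract_golden_tickers` is part of. A minimal sketch of that approach — only the names `extract_golden_tickers` and `COMMON_WORDS_BLACKLIST` appear in the diff; the regexes and the fallback function are illustrative assumptions:

```python
import re

# Illustrative subset; the real set is the full list in the diff above.
COMMON_WORDS_BLACKLIST = {"I", "THE", "CEO", "FED", "EBIT", "YOLO"}

def extract_golden_tickers(text):
    # "Golden" mentions are $-prefixed (e.g. $TSLA) and bypass the blacklist.
    return set(re.findall(r"\$([A-Z]{1,5})\b", text.upper()))

def extract_fallback_tickers(text):
    # Bare ALL-CAPS words are kept only if they are not common-word noise.
    words = re.findall(r"\b[A-Z]{1,5}\b", text)
    return {w for w in words if w not in COMMON_WORDS_BLACKLIST}

print(extract_golden_tickers("The CEO pumped $TSLA today"))    # {'TSLA'}
print(extract_fallback_tickers("I think TSLA beats THE FED"))  # {'TSLA'}
```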
setup.py
@@ -7,7 +7,7 @@ with open("requirements.txt") as f:

 setup(
 name="reddit-stock-analyzer",
-version="0.0.1",
+version="0.0.2",
 author="Pål-Kristian Hamre",
 author_email="its@pkhamre.com",
 description="A command-line tool to analyze stock ticker mentions on Reddit.",
@@ -20,6 +20,7 @@ setup(
 "rstat=rstat_tool.main:main",
 "rstat-dashboard=rstat_tool.dashboard:start_dashboard",
 "rstat-cleanup=rstat_tool.cleanup:run_cleanup",
+"rstat-flairs=rstat_tool.flair_finder:main",
 ],
 },
 )
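On the new `rstat-flairs` entry point: setuptools `console_scripts` map an executable name to a `module:callable` pair, so after `pip install .` the command `rstat-flairs` invokes `main()` in `rstat_tool/flair_finder.py`. A stub of what such a target looks like — the module body below is illustrative; only the entry-point string comes from the diff:

```python
# rstat_tool/flair_finder.py -- illustrative stub; the real module ships in
# the repo, and only needs to expose a zero-argument callable named `main`.
def main():
    print("Listing available flairs for the configured subreddits...")

if __name__ == "__main__":
    main()
```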
@@ -49,7 +49,7 @@
 <div class="text-center">Mentions</div>
 <div class="text-center">Sentiment</div>
 <div>Mkt Cap</div>
-<div>Close Price</div>
+<div>Last Price</div>
 </div>
 </div>

@@ -96,11 +96,10 @@
 <div class="text-lg font-semibold text-white">{{ ticker.market_cap | format_mc }}</div>
 </div>
 <div>
-<div class="sm:hidden text-xs font-bold text-slate-500 uppercase tracking-wider mb-1">Close
-Price</div>
+<div class="sm:hidden text-xs font-bold text-slate-500 uppercase tracking-wider mb-1">Last Price</div>
 <div class="text-lg font-semibold text-white">
 {% if ticker.closing_price %}<a
-href="https://finance.yahoo.com/quote/{{ ticker.symbol }}" target="_blank"
+href="https://www.marketwatch.com/investing/stock/{{ ticker.symbol }}" target="_blank"
 class="hover:text-blue-400 transition-colors">${{
 "%.2f"|format(ticker.closing_price) }}</a>
 {% else %}N/A{% endif %}