# rstat - Reddit Stock Analyzer

A powerful, installable command-line tool and web dashboard to scan Reddit for stock ticker mentions, perform sentiment analysis, generate insightful reports, and create shareable summary images.

## Key Features

* **Dual-Interface:** Use a flexible command-line tool (`rstat`) for data collection and a simple web dashboard (`rstat-dashboard`) for data visualization.
* **Flexible Data Scraping:**
    * Scan subreddits from a config file or target a single subreddit on the fly.
    * Configure the time window to scan posts from the last 24 hours (for daily cron jobs) or back-fill data from several past days (e.g., the last 7 days).
    * Fetches from `/new` to capture the most recent discussions.
* **Deep Analysis & Storage:**
    * Scans both post titles and comments, differentiating between the two.
    * Performs a "deep dive" analysis on posts to calculate the average sentiment of the entire comment section.
    * Persists all data in a local SQLite database (`reddit_stocks.db`) to track trends over time.
* **Rich Data Enrichment:**
    * Calculates sentiment (Bullish, Bearish, Neutral) for every mention using NLTK (see the sketch after this list).
    * Fetches and stores daily closing prices and market capitalization from Yahoo Finance.
* **Interactive Web Dashboard:**
    * View the Top 10 tickers across all subreddits or on a per-subreddit basis.
    * Click any ticker to open a "Deep Dive" page showing every post it was mentioned in.
* **Shareable Summary Images:**
    * Generate clean, dark-mode daily and weekly sentiment summary images for any subreddit, perfect for sharing.
* **High-Quality Data:**
    * Uses a configurable blacklist and smart filtering to reduce false positives.
    * Automatically cleans the database of invalid tickers when the blacklist is updated.

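To make the sentiment step concrete, here is a minimal sketch of how a mention can be classified as Bullish, Bearish, or Neutral with NLTK's VADER analyzer. The `classify_sentiment` helper and the ±0.05 thresholds are illustrative assumptions, not the exact logic inside rstat.

```python
# Illustrative sketch only: VADER-based sentiment labelling.
# Requires the vader_lexicon download described in the setup steps below.
from nltk.sentiment.vader import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

def classify_sentiment(text: str) -> str:
    """Map VADER's compound score to the labels used throughout this README."""
    score = analyzer.polarity_scores(text)["compound"]
    if score >= 0.05:
        return "Bullish"
    if score <= -0.05:
        return "Bearish"
    return "Neutral"

print(classify_sentiment("Loaded up on more $NVDA calls, this thing won't stop"))
```
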
## Project Structure

```
reddit_stock_analyzer/
├── .env                    # Your secret API keys
├── requirements.txt        # Project dependencies
├── setup.py                # Installation script for the tool
├── subreddits.json         # Default list of subreddits to scan
├── templates/              # HTML templates for the web dashboard
│   ├── base.html
│   ├── index.html
│   ├── subreddit.html
│   ├── deep_dive.html
│   ├── image_view.html
│   └── weekly_image_view.html
└── rstat_tool/             # The main source code package
    ├── __init__.py
    ├── main.py             # Scraper entry point and CLI logic
    ├── dashboard.py        # Web dashboard entry point (Flask app)
    ├── database.py         # All SQLite database functions
    └── ...
```

## Setup and Installation

Follow these steps to set up the project on your local machine.

### 1. Prerequisites

* Python 3.7+
* Git

### 2. Clone the Repository

```bash
git clone <your-repository-url>
cd reddit_stock_analyzer
```

### 3. Set Up a Python Virtual Environment

It is highly recommended to use a virtual environment to manage dependencies.

**On macOS / Linux:**

```bash
python3 -m venv .venv
source .venv/bin/activate
```

**On Windows:**

```bash
python -m venv .venv
.\.venv\Scripts\activate
```

### 4. Install Dependencies

```bash
pip install -r requirements.txt
```

### 5. Configure Reddit API Credentials

1. Go to the [Reddit Apps preferences page](https://www.reddit.com/prefs/apps) and create a new "script" app.
2. Create a file named `.env` in the root of the project directory.
3. Add your credentials to the `.env` file like this:

```
REDDIT_CLIENT_ID=your_client_id_from_reddit
REDDIT_CLIENT_SECRET=your_client_secret_from_reddit
REDDIT_USER_AGENT=A custom user agent string (e.g., python:rstat:v1.2)
```

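For reference, the snippet below is a minimal sketch of how these three variables are typically loaded with `python-dotenv` and handed to `praw`; the actual initialization inside `rstat_tool` may be organized differently.

```python
# Illustrative sketch only: loading .env credentials for a PRAW client.
import os

import praw
from dotenv import load_dotenv

load_dotenv()  # reads the .env file from the project root

reddit = praw.Reddit(
    client_id=os.environ["REDDIT_CLIENT_ID"],
    client_secret=os.environ["REDDIT_CLIENT_SECRET"],
    user_agent=os.environ["REDDIT_USER_AGENT"],
)

# Script apps without a username/password run in read-only mode, which is
# enough for scraping public posts and comments.
print(reddit.read_only)
```
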
### 6. Set Up NLTK

Run the included setup script **once** to download the required `vader_lexicon` for sentiment analysis.

```bash
python rstat_tool/setup_nltk.py
```

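Under the hood, this one-time step usually amounts to a single `nltk.download` call, roughly as sketched below; the bundled `setup_nltk.py` may do more.

```python
# Illustrative sketch of a one-time NLTK setup; see rstat_tool/setup_nltk.py
# for the script actually shipped with the project.
import nltk

# Fetches the VADER lexicon used for sentiment scoring (a no-op if it is
# already present in your NLTK data directory).
nltk.download("vader_lexicon")
```
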
### 7. Set Up Playwright

Run the Playwright install routine. It may ask you to install additional system dependencies; follow the on-screen instructions if it does.

```bash
playwright install
```

### 8. Build and Install the Commands

Install the tool in "editable" mode. This creates the `rstat` and `rstat-dashboard` commands in your virtual environment and links them to your source code.

```bash
pip install -e .
```

The installation is now complete.

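The two commands are created by console-script entry points declared in `setup.py`. The sketch below shows how such a declaration typically looks; the module paths, function names, and metadata are assumptions rather than a copy of this project's `setup.py`.

```python
# setup.py sketch (illustrative): wiring console scripts to package functions.
# The main() entry points assumed here may not match the real module layout.
from setuptools import find_packages, setup

setup(
    name="rstat",
    version="0.1.0",
    packages=find_packages(),
    entry_points={
        "console_scripts": [
            "rstat=rstat_tool.main:main",
            "rstat-dashboard=rstat_tool.dashboard:main",
        ],
    },
)
```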
---

## Usage

The tool is split into two commands: one for gathering data and one for viewing it.

### 1. The Scraper (`rstat`)

This is the command-line tool you will use to populate the database. It is highly flexible.

**Common Commands:**

* **Run a daily scan (for cron jobs):** Scans subreddits from `subreddits.json` for posts in the last 24 hours.

  ```bash
  rstat --config subreddits.json --days 1
  ```

* **Scan a single subreddit:** Ignores the config file and scans just one subreddit.

  ```bash
  rstat --subreddit wallstreetbets --days 1
  ```

* **Back-fill data for the last week:** Scans a specific subreddit for all new posts in the last 7 days.

  ```bash
  rstat --subreddit Tollbugatabets --days 7
  ```

* **Get help and see all options:**

  ```bash
  rstat --help
  ```

### 2. The Web Dashboard (`rstat-dashboard`)

This command starts a local web server to let you explore the data you've collected.

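As a rough picture of what that involves, the sketch below shows the kind of Flask entry point `rstat-dashboard` presumably wraps; route names, template wiring, and server settings here are assumptions, and `rstat_tool/dashboard.py` is the authoritative implementation.

```python
# Illustrative sketch of a Flask-based dashboard entry point; the real
# routes, queries, and templates live in rstat_tool/dashboard.py.
from flask import Flask, render_template

app = Flask(__name__)

@app.route("/")
def index():
    # The real dashboard renders templates/index.html with Top 10 ticker
    # data pulled from reddit_stocks.db.
    return render_template("index.html")

def main():
    # The rstat-dashboard console script presumably starts a local
    # development server along these lines.
    app.run(debug=True)

if __name__ == "__main__":
    main()
```

Once the server is running, open the address printed in the terminal in your browser; Flask's development server defaults to `http://127.0.0.1:5000`.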