Commit fa4671b

update readme
1 parent c001226 commit fa4671b

2 files changed: +9 -39 lines changed

README.md

+8 -39
@@ -15,50 +15,19 @@ The journey of a thousand miles begins with a single commit. Happy coding!
* Summaries: AI summarizes your entries, your week, your year.
* Themes: AI shows your recurring themes & issues. Also valuable for dream themes.
+ * Prompt: chat with your journal (TODO integrate local language models)
* Books: AI recommends self-help books based on your entries.
- * Security: All text is industry-standard encrypted.
+ * Security: industry best practices
* Field Tracking (lots to be done here): Track fields (mood, sleep, substance intake, etc). AI shows you how they interact and which ones to focus on.
* Share (coming soon): Share journals with therapists, who can use all these tools to catch up since your last session.
* Questions (coming soon): Ask AI anything about yourself. The answers and insights may surprise you.

# Setup
- Currently very hairy, will clean this up soon.
+ This is an SST site (CDK, AWS), which means even local development runs against an AWS stack. Normally that's awesome and cheap, but for Gnothi there's a VPC for security (private subnets and a NAT gateway), which is $30/mo min, and Aurora Serverless v2 Postgres, which is $40/mo min. So I need to figure out how to Dockerize the dev requirements for localhost development. In the meantime, if you see an opportunity for bugs/features in the code, take a stab at it and I'll integrate it on my end. I'll beef up this README when I can get a viable local dev setup.

- ### Essentials
- * Install Postgres. Currently not using Docker, as I'm constantly pruning and I want to keep my data between sessions (and use the same SQL hosts for other projects).
- * `cp common/config.example.json common/config.json` and modify
- * Install Docker & docker-compose
+ ### Steps
* `docker-compose up -d`
- * If you get errors with `gpu-dev`, try `docker-compose up -d client && docker-compose up -d server` (then see section below)
-
- ### To use AI
- * The client & server should run without the AI stuff, for a while, but you'll want to get this working eventually.
- * Libgen
- * Quickstart by extracting https://gnothiai.com/libgens.zip to /storage/libgen/*. Each file starts with <ENVIRONMENT>, so replace with "testing" or "development" or such.
- * If you're not interested in books development, you can stop now. Below is how to generate those libgen files.
- * Install MySQL server on your host
- * Download [libgen/dbdumps/libgen.rar](http://gen.lib.rus.ec/dbdumps/), extract, import into MySQL
- * Modify `common/config.json` for MySQL/libgen
- * Install Docker & docker-compose [with GPU support](https://ocdevel.com/blog/20201207-wsl2-gpu-docker)
- * `docker-compose up -d`
-
- I'll be developing lefnire/ml-tools actively along with Gnothi, so on my machine it's set up like:
-
- ```
- git clone https://github.com/lefnire/ml-tools.git # might need to delete that folder first, if docker-compose created it
- docker-compose exec gpu-dev
- $ pip install -e /ml-tools
- ```
-
- ## Tests
- First test GPU, which sets up fixtures. Then test server. Client tests sorely needed!
-
- ```
- docker-compose exec gpu-dev bash
- $ pytest tests -svv
- $ # once it's done, you want to run it in another tab for your server tests
- $ ENVIRONMENT=testing python app/run.py
-
- docker-compose exec server bash
- $ pytest tests -svv
- ```
+ * `cp .env .env.shared-prod` -> modify with your email
+ * `cp .env .env.dev` -> modify with your email
+ * `AWS_PROFILE=<your profile> npm start`
+ * `cd web && npm start`
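
For context on the cost floor cited in the new Setup paragraph, here is a minimal, hypothetical aws-cdk-lib (CDK v2) sketch of the two always-on pieces it describes: a VPC with private subnets plus a NAT gateway, and an Aurora Serverless v2 Postgres cluster. This is not Gnothi's actual stack code; the stack and construct names, engine version, and capacity values are illustrative assumptions.

```ts
import { App, Stack } from "aws-cdk-lib";
import * as ec2 from "aws-cdk-lib/aws-ec2";
import * as rds from "aws-cdk-lib/aws-rds";

// Hypothetical illustration only; names are placeholders, not from the Gnothi codebase.
const app = new App();
const stack = new Stack(app, "DevCostSketch");

// Private subnets need a NAT gateway for outbound traffic, and the NAT gateway
// bills hourly whether or not anything is running -- the ~$30/mo floor above.
const vpc = new ec2.Vpc(stack, "Vpc", {
  natGateways: 1,
  subnetConfiguration: [
    { name: "public", subnetType: ec2.SubnetType.PUBLIC },
    { name: "private", subnetType: ec2.SubnetType.PRIVATE_WITH_EGRESS },
  ],
});

// Aurora Serverless v2 can't scale to zero; its 0.5 ACU minimum is the other
// always-on cost (~$40/mo) the README mentions.
new rds.DatabaseCluster(stack, "Postgres", {
  engine: rds.DatabaseClusterEngine.auroraPostgres({
    version: rds.AuroraPostgresEngineVersion.VER_14_6,
  }),
  vpc,
  vpcSubnets: { subnetType: ec2.SubnetType.PRIVATE_WITH_EGRESS },
  writer: rds.ClusterInstance.serverlessV2("writer"),
  serverlessV2MinCapacity: 0.5,
  serverlessV2MaxCapacity: 2,
});
```

Swapping these two pieces for a local Postgres container is roughly what "Dockerize the dev requirements" would mean for localhost development.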

services/ml/python/.gitignore

+1
@@ -8,3 +8,4 @@ books/db
mock_entries.json
haystack
**/*.parquet
+ *.ipynb
