Since training was done on Google Colab and the trained models are saved on Google Drive, you will have to grant the notebooks access to your Drive.
Create a folder named Stackoverflow_VS_extension in your Google Drive.
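If you are running the notebooks in Colab, mounting Drive and creating the folder can be done from a notebook cell; a minimal sketch, assuming Colab's default `/content/drive` mount point:

```python
# Mount Google Drive in Colab and create the working folder.
# Assumes the default Colab mount point /content/drive; on older
# Colab runtimes the Drive root may be 'My Drive' instead of 'MyDrive'.
import os
from google.colab import drive

drive.mount('/content/drive')

folder = '/content/drive/MyDrive/Stackoverflow_VS_extension'
os.makedirs(folder, exist_ok=True)  # no-op if the folder already exists
print('Working folder:', folder)
```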
Notebooks:
- Dataset helper: Open the link; you will have to log in to your Kaggle account, click the "Copy and Edit" option to open the notebook, and run all the cells.
After downloading the dataset, place it in the Stackoverflow_VS_extension folder in Drive, then execute the following notebooks.
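If the dataset was downloaded inside the Colab session, it can also be copied into the Drive folder programmatically; a minimal sketch, where `stackoverflow_dataset.csv` is a hypothetical placeholder for the actual filename produced by the Dataset helper notebook:

```python
# Copy the downloaded dataset into the Drive folder.
# 'stackoverflow_dataset.csv' is a hypothetical placeholder; replace it
# with the actual file the Dataset helper notebook produces.
import shutil

src = 'stackoverflow_dataset.csv'
dst = '/content/drive/MyDrive/Stackoverflow_VS_extension/stackoverflow_dataset.csv'
shutil.copy(src, dst)
print('Copied dataset to', dst)
```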
STEP 1:
Download Preprocessed_data.csv, Tag_predictor_weights.h5, tokenizer.pickle, SO_word2vec_embeddings.bin, and title_embeddings.csv from the Stackoverflow_VS_extension
folder in Google Drive created while training, or directly use this link. Save all these files to the /models directory.
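A quick way to confirm the files landed in the right place is a small check script; a minimal sketch, assuming it is run from the directory that contains models/:

```python
# Sanity-check that all model artifacts are present in models/.
import os

required = [
    'Preprocessed_data.csv',
    'Tag_predictor_weights.h5',
    'tokenizer.pickle',
    'SO_word2vec_embeddings.bin',
    'title_embeddings.csv',
]

missing = [f for f in required if not os.path.isfile(os.path.join('models', f))]
if missing:
    raise SystemExit(f'Missing model files: {missing}')
print('All model files found.')
```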
STEP 2:
- cd extension/stackoverflowextension/
- npm install
STEP 3:
Server setup:
- cd extension/stackoverflowextension/python_script/
- python3 -m venv .
- source bin/activate
- pip3 install -r requirements.txt
- python3 server.py
- Open the link shown in the terminal. This will set up the server; wait 1-2 minutes for "Data read" to appear on the webpage (see the smoke-test sketch after this list).
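Once the server reports it is ready, you can confirm it responds from Python; a minimal sketch, where `http://localhost:5000/` is an assumption (Flask's default port), not an address confirmed by this README:

```python
# Smoke-test the local server. The URL is an assumption; adjust it to
# whatever address server.py actually prints on startup.
import urllib.request

url = 'http://localhost:5000/'
with urllib.request.urlopen(url, timeout=5) as resp:
    print('Server responded with status', resp.status)
```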
STEP 4:
The following steps require VS Code to be installed.
- cd extension/stackoverflowextension
- code .
- Press F5. This will open the extension development window.
- Press Ctrl + Shift + P and type StackOverflow Engine to open the extension's main page.
- Enter the query you want to search for and press the Search button.