This is a FastAPI backend for fetching and summarizing news articles using Google News and AI models such as OpenAI GPT and Google Gemini.
- Fetch top news articles using Google News.
- Summarize articles using OpenAI GPT or Google Gemini.
- Caching of fetched articles and summaries to improve performance.
- Asynchronous operations to handle multiple requests efficiently (a sketch of how these two features can fit together follows below).
- Dependencies are defined in `requirements.txt`.
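As an illustration only, not this project's actual code, the caching and async features described above can be combined in a pattern like the following; the `NewsRequest` model, the `fetch_top_articles` placeholder, and the cache layout are assumptions made for the example.

```python
# Illustrative sketch only (not this repository's implementation): an async
# route that serves results from a simple in-memory, time-based cache.
import time
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

CACHE: dict[str, tuple[float, list]] = {}  # language -> (timestamp, articles)
TIMEOUT = 43200                            # cache lifetime in seconds (12 hours)

class NewsRequest(BaseModel):
    num: int = 5
    language: str = "it"

async def fetch_top_articles(num: int, language: str) -> list:
    # Placeholder: the real app queries Google News here.
    return [{"url": f"https://example.com/{language}/{i}", "title": f"Article {i}"} for i in range(num)]

@app.post("/news")
async def news(req: NewsRequest):
    cached = CACHE.get(req.language)
    if cached and time.time() - cached[0] < TIMEOUT:
        articles = cached[1]               # cache hit: reuse previously fetched articles
    else:
        articles = await fetch_top_articles(req.num, req.language)
        CACHE[req.language] = (time.time(), articles)
    return articles[:req.num]
```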
- Clone the repository:

  ```
  git clone https://github.com/marcodestefano/newsfetcher.git
  cd newsfetcher
  ```

- Create a virtual environment and activate it:

  ```
  python -m venv venv
  source venv/bin/activate  # On Windows use `venv\Scripts\activate`
  ```

- Install the required dependencies:

  ```
  pip install -r requirements.txt
  ```

- Create a `.env` file in the root directory and add your configuration:

  ```
  PORT=8000
  TIMEOUT=43200  # 12 hours in seconds
  ARTICLES=5
  MAX_ARTICLES=15
  LANGUAGE=it
  OPENAI_API_KEY=your_openai_api_key
  GEMINI_API_KEY=your_gemini_api_key
  ```
Run the FastAPI server:

```
fastapi dev main.py
```
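As a rough sketch of how the `.env` variables above could be read at startup (the repository's own code may differ, and the `python-dotenv` package is assumed here):

```python
# Sketch of reading the .env configuration; not necessarily how this project
# does it. Assumes the python-dotenv package is installed.
import os
from dotenv import load_dotenv

load_dotenv()  # loads variables from .env into the process environment

PORT = int(os.getenv("PORT", 8000))
TIMEOUT = int(os.getenv("TIMEOUT", 43200))         # cache lifetime in seconds
ARTICLES = int(os.getenv("ARTICLES", 5))           # default number of articles
MAX_ARTICLES = int(os.getenv("MAX_ARTICLES", 15))  # upper bound per request
LANGUAGE = os.getenv("LANGUAGE", "it")
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
GEMINI_API_KEY = os.getenv("GEMINI_API_KEY")
```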
- URL: `/`
- Method: `GET`
- Response: JSON object with status "OK".
- URL: `/news`
- Method: `POST`
- Parameters:
  - `num` (int): Number of articles to fetch. Default is 5.
  - `language` (str): Language of the news. Default is 'it'.
- Response: JSON array of articles with "url" and "title".
- URL: `/article`
- Method: `POST`
- Parameters:
  - `url` (str): URL of the article to fetch.
  - `ai` (str): AI service to use ('openai' or 'gemini'). Default is `None`. If `ai` is not defined, the article is not summarized.
  - `model` (str): Model to use with the AI service. Default models are used if not specified: gpt-3.5-turbo-0125 for openai and gemini-1.5-flash-latest for gemini.
  - `aikey` (str): API key for the AI service. Default is `None`.
- Response: JSON object with "title" and "content".
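To make the `ai`, `model`, and `aikey` parameters concrete, here is a hedged sketch of how a summarization call could be dispatched to either service. It is not taken from this repository; it assumes the public `openai` and `google-generativeai` client libraries and a hypothetical `summarize` helper.

```python
# Illustrative sketch of dispatching a summary request to OpenAI or Gemini;
# not this repository's actual code. Assumes the openai and
# google-generativeai packages are installed.
from openai import OpenAI
import google.generativeai as genai

def summarize(text: str, ai: str | None, model: str | None, aikey: str | None) -> str | None:
    if ai is None:
        return None  # no AI service requested: skip summarization
    prompt = f"Summarize the following article:\n\n{text}"
    if ai == "openai":
        client = OpenAI(api_key=aikey)  # falls back to the OPENAI_API_KEY env var if aikey is None
        response = client.chat.completions.create(
            model=model or "gpt-3.5-turbo-0125",
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content
    if ai == "gemini":
        genai.configure(api_key=aikey)
        gemini = genai.GenerativeModel(model or "gemini-1.5-flash-latest")
        return gemini.generate_content(prompt).text
    raise ValueError(f"Unsupported ai service: {ai}")
```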
Using Postman or any HTTP client, send a POST request to `/news`:

```json
{
  "num": 1,
  "language": "en"
}
```
And to `/article`:

```json
{
  "url": "https://example.com/article",
  "ai": "openai",
  "model": "gpt-3.5-turbo",
  "aikey": "your_openai_api_key"
}
```
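Equivalently, the same requests can be sent from Python with the `requests` library, assuming the server is running locally on the configured port (8000 in the example `.env`) and replacing the placeholder values:

```python
# Example client calls, assuming the server is reachable at http://localhost:8000.
import requests

BASE_URL = "http://localhost:8000"

# Fetch one English-language article.
news = requests.post(f"{BASE_URL}/news", json={"num": 1, "language": "en"}).json()
print(news)  # e.g. [{"url": "...", "title": "..."}]

# Fetch and summarize a single article with OpenAI (replace the placeholders).
article = requests.post(
    f"{BASE_URL}/article",
    json={
        "url": news[0]["url"],
        "ai": "openai",
        "model": "gpt-3.5-turbo",
        "aikey": "your_openai_api_key",
    },
).json()
print(article["title"], article["content"], sep="\n")
```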
The app is ready to be deployed on Vercel.
Contributions are welcome; please create a pull request with a description of your changes.
This project is licensed under the Apache 2.0 License.