Commit 8b78b50: Initial commit (0 parents)

File tree: 15 files changed, +903 / -0 lines

README.md

Lines changed: 78 additions & 0 deletions
# 🚀 **Module 3: Managing Your Container Workspace**

**Technology Stack:**

- Python
- DevSpaces
- Kafka

---

## 🎯 **Scenario**

Inside this workspace is a Python application that connects to an OpenShift Kafka cluster. There are three commands you can run from within the application.
```shell
# Run the producer
python producer.py --topic <name_of_topic>
```

```shell
# Run the consumer
python consumer.py --topic <name_of_topic>
```

```shell
# Run the web app
uvicorn main:app --host 0.0.0.0 --port 8000 --reload
```
---

## 🐾 **Guided Walkthrough**

The application has some problems that we would like you to fix:

1. Run `pip install -r requirements.txt` to install the necessary dependencies
2. Set the `KAFKA_BOOTSTRAP_SERVERS` environment variable based on the Kafka cluster you created
3. In a separate browser window, open https://devfile.io/docs/2.3.0/quickstart-che as a reference
4. Create a new devfile (`devfile.yaml`) in the project root
    - 4.1. Point to the Python UDI image: registry.access.redhat.com/ubi9/python-39:1-1739420387
    - 4.2. Configure memory and CPU
    - 4.3. Do not rely on the CLI anymore!
        - 4.3.1. Create a build task for installation
        - 4.3.2. Create a run task for consumer.py
        - 4.3.3. Create a run task for producer.py
        - 4.3.4. Create a run task for main.py
    - 4.4. Set the `KAFKA_BOOTSTRAP_SERVERS` environment variable to the Kafka instance you created
5. Create a new topic named `today` via the producer command
    - 5.1. Add 7-8 messages to it
6. Run the consumer command and point the topic to `today`
    - 6.1. Verify messages are funneling through
7. Run the web application (if you have built a command for main.py, you can use that)
    - 7.1. In the topics input, enter `today` and verify messages are funneling through
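One way the finished devfile from steps 4.1-4.4 might look is sketched below. The command IDs, resource limits, and bootstrap address are illustrative assumptions; substitute the address of the Kafka instance you created.

```yaml
schemaVersion: 2.2.2
metadata:
  name: workshop-module3-python
components:
  - name: py
    container:
      image: registry.access.redhat.com/ubi9/python-39:1-1739420387   # 4.1: Python UDI image
      memoryLimit: 512Mi        # 4.2: assumed values; size for your workload
      cpuLimit: 500m
      mountSources: true
      env:
        - name: KAFKA_BOOTSTRAP_SERVERS     # 4.4: replace with your Kafka instance
          value: my-cluster-kafka-bootstrap.my-namespace.svc:9092
commands:
  - id: install-deps            # 4.3.1: build task
    exec:
      component: py
      commandLine: pip install -r requirements.txt
      group:
        kind: build
        isDefault: true
  - id: run-consumer            # 4.3.2
    exec:
      component: py
      commandLine: python consumer.py --topic today
      group:
        kind: run
  - id: run-producer            # 4.3.3
    exec:
      component: py
      commandLine: python producer.py --topic today
      group:
        kind: run
  - id: run-webapp              # 4.3.4
    exec:
      component: py
      commandLine: uvicorn main:app --host 0.0.0.0 --port 8000
      group:
        kind: run
        isDefault: true
```

With this in place, Dev Spaces exposes each command in the IDE, so you no longer need to type the CLI invocations from the Scenario section by hand.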
---

## 🧩 **Challenge**

- Add your devfile tasks to `tasks.json`
- Load 3 Python extensions into a `.vscode/extensions.json` file
- Write a few small unit tests using mocks for the Kafka producer to verify that your logic (such as input handling and retry mechanisms) works as expected without needing a real Kafka cluster.
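For the unit-test challenge, one way to mock the producer with `unittest.mock.AsyncMock` is sketched below. `send_with_retry` is a hypothetical helper (the retry logic you write may differ); its `send_and_wait` call mirrors the `aiokafka` producer method, but no real cluster is contacted.

```python
import asyncio
from unittest.mock import AsyncMock

async def send_with_retry(producer, topic, message, retries=3):
    """Hypothetical helper: send a message, retrying on connection failure."""
    for attempt in range(1, retries + 1):
        try:
            await producer.send_and_wait(topic, message.encode("utf-8"))
            return attempt
        except ConnectionError:
            if attempt == retries:
                raise

def test_retries_until_success():
    producer = AsyncMock()
    # Fail twice, then succeed -- no real Kafka cluster needed.
    producer.send_and_wait.side_effect = [ConnectionError, ConnectionError, None]
    attempts = asyncio.run(send_with_retry(producer, "today", "hello"))
    assert attempts == 3
    assert producer.send_and_wait.await_count == 3

test_retries_until_success()
```

Because `AsyncMock` records every awaited call, you can assert both the outcome and how many attempts were made.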
---

## 🥚 **Easter Eggs!**

- There is an easter egg in the code
- Be thorough! Look through the code.

---

## **Key Takeaways**

- Built your first devfile!
- Configured your DevSpaces / odo workspace
- Customized your VS Code environment
- First interaction with Streams for Apache Kafka
- Introductory patterns around event-driven architecture

app.py

Lines changed: 33 additions & 0 deletions
```python
# app.py
import sys
import asyncio
import producer
import consumer
from dotenv import load_dotenv

load_dotenv()

async def main():
    if len(sys.argv) < 2:
        print("Usage: python app.py <produce|consume>")
        return

    command = sys.argv[1].lower()

    if command == "produce":
        # Call producer.py's async produce function
        await producer.produce()

    elif command == "consume":
        # Call the async consumer function
        await consumer.consume()

    else:
        print("Invalid argument. Use 'produce' or 'consume'.")

if __name__ == "__main__":
    asyncio.run(main())

# HELP! My neighbor's favorite topic is pets, but he always gets my cat's name wrong!
# I've suggested he look on pk-kafka-kafka-bootstrap.pk-world.svc.cluster.local for my cat's name
# Think you can find my cat's name?
```

catalog-info.yaml

Lines changed: 18 additions & 0 deletions
```yaml
apiVersion: backstage.io/v1alpha1
kind: Component
metadata:
  name: workshop-module3-python
  title: Module 3 - Python
  description: |
    Introduction to Red Hat Dev Spaces
  annotations:
    backstage.io/techdocs-ref: dir:.
  tags:
    - devspaces
    - kafka
    - python
spec:
  type: module
  lifecycle: production
  owner: group:default/cluster-admins
  subcomponentOf: component:default/workshop-repository
```

consumer.py

Lines changed: 116 additions & 0 deletions
```python
import os
import uuid
import argparse
import asyncio
import logging
import colorlog
from aiokafka import AIOKafkaConsumer

# ANSI styles for terminal colors
STYLES = {
    "blue": "\033[1;34m",
    "yellow": "\033[1;33m",
    "red": "\033[1;31m",
    "green": "\033[1;32m",
    "magenta": "\033[1;35m",
    "cyan": "\033[1;36m",
    "reset": "\033[0m"
}

# Emoji + color by topic
TOPIC_META = {
    "alerts": {"emoji": "🚨", "style": STYLES["red"]},
    "health": {"emoji": "💚", "style": STYLES["green"]},
    "transactions": {"emoji": "💸", "style": STYLES["cyan"]},
    "default": {"emoji": "📩", "style": STYLES["blue"]},
}

# Colorlog setup
handler = colorlog.StreamHandler()
handler.setFormatter(colorlog.ColoredFormatter(
    "%(log_color)s%(message)s",
    log_colors={
        "DEBUG": "cyan",
        "INFO": "white",
        "WARNING": "yellow",
        "ERROR": "red",
        "CRITICAL": "bold_red",
    }
))
logger = colorlog.getLogger(__name__)
logger.addHandler(handler)
logger.setLevel(logging.INFO)


async def consume(topic: str):
    bootstrap_servers = os.environ.get("KAFKA_BOOTSTRAP_SERVERS", "localhost:9092")
    group_id = f"test-group-{uuid.uuid4()}"

    if not bootstrap_servers:
        logger.error("KAFKA_BOOTSTRAP_SERVERS environment variable is not set.")
        return

    logger.info(f"🔌 KAFKA_BOOTSTRAP_SERVERS: {bootstrap_servers}")
    logger.info(f"📦 Kafka topic: {topic}")
    logger.info(f"👥 Consumer group: {group_id}")

    consumer = AIOKafkaConsumer(
        topic,
        bootstrap_servers=bootstrap_servers,
        group_id=group_id,
        auto_offset_reset='earliest'
    )

    await consumer.start()
    try:
        logger.info(f"🧲 Subscribed to {topic}, waiting for messages...\n")
        async for msg in consumer:
            decoded = msg.value.decode('utf-8')
            partition = msg.partition
            topic_name = msg.topic

            # Determine style & emoji based on topic
            meta = TOPIC_META.get(topic_name, TOPIC_META["default"])
            emoji = meta["emoji"]
            style = meta["style"]

            # Override style if content indicates severity
            lowered = decoded.lower()
            if "error" in lowered:
                style = STYLES["red"]
            elif "warn" in lowered:
                style = STYLES["yellow"]
            elif "success" in lowered or "ok" in lowered:
                style = STYLES["green"]

            # Print the styled message
            #logger.info(f"{style}{emoji} [{topic_name}-p{partition}]: {decoded}{STYLES['reset']}")
            message_style = STYLES["yellow"]
            logger.info(f"{style}{emoji} [{topic_name}-p{partition}]: {message_style}{decoded}{STYLES['reset']}")

    except KeyboardInterrupt:
        logger.info("👋 Consumer stopped by user.")
    finally:
        await consumer.stop()
        logger.info("🛑 Kafka consumer stopped.")


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Kafka Consumer")
    parser.add_argument(
        "--topic",
        type=str,
        default="test-topic",
        help="Kafka topic to consume messages from (default: test-topic)"
    )
    args = parser.parse_args()

    try:
        asyncio.run(consume(args.topic))
    except KeyboardInterrupt:
        print("\n👋 Consumer interrupted by user.")


# HELP! My neighbor's favorite topic is pets, but he always gets my cat's name wrong!
# I've suggested he look on pk-kafka-kafka-bootstrap.pk-world.svc.cluster.local for my cat's name
# Think you can find my cat's name?
```
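The severity-override logic in `consume()` becomes trivially testable if you factor it into a pure function. The sketch below does that with a hypothetical `pick_style` helper (not part of consumer.py), so it runs with no Kafka cluster or colorlog at all:

```python
# Hypothetical refactor: the severity-override block from consume(),
# extracted into a pure function so it can be tested without Kafka.
STYLES = {
    "blue": "\033[1;34m",
    "yellow": "\033[1;33m",
    "red": "\033[1;31m",
    "green": "\033[1;32m",
}

def pick_style(message: str, default_style: str) -> str:
    """Return the ANSI style for a message, overriding by severity keyword."""
    lowered = message.lower()
    if "error" in lowered:
        return STYLES["red"]
    if "warn" in lowered:
        return STYLES["yellow"]
    if "success" in lowered or "ok" in lowered:
        return STYLES["green"]
    return default_style

# Quick checks, no broker needed:
assert pick_style("Disk ERROR on node 3", STYLES["blue"]) == STYLES["red"]
assert pick_style("nothing unusual here", STYLES["blue"]) == STYLES["blue"]
```

Keeping I/O (the consumer loop) separate from decisions (style selection) is the same pattern the Challenge section asks you to apply to the producer.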

devfile.yaml

Lines changed: 16 additions & 0 deletions
```yaml
schemaVersion: 2.2.2
metadata:
  name: workshop-module3-python
  version: 1.0.0
  description: DevSpaces workspace for Python app
attributes:
  che-editor: vscode
components:
  - name: py
    container:
      args:
        - tail
        - -f
        - /dev/null
      image: image-registry.openshift-image-registry.svc.cluster.local:5000/openshift/devspaces-dotnet-python:latest
      mountSources: true
```

docs/index.md

Lines changed: 78 additions & 0 deletions
# 🚀 **Module 3: Managing Your Container Workspace**

**Technology Stack:**

- Python
- DevSpaces
- Kafka

---

## 🎯 **Scenario**

Inside this workspace is a Python application that connects to an OpenShift Kafka cluster. There are three commands you can run from within the application.

```shell
# Run the producer
python producer.py --topic <name_of_topic>
```

```shell
# Run the consumer
python consumer.py --topic <name_of_topic>
```

```shell
# Run the web app
uvicorn main:app --host 0.0.0.0 --port 8000 --reload
```

---

## 🐾 **Guided Walkthrough**

The application has some problems that we would like you to fix:

1. Run `pip install -r requirements.txt` to install the necessary dependencies
2. Set the `KAFKA_BOOTSTRAP_SERVERS` environment variable based on the Kafka cluster you created
3. In a separate browser window, open https://devfile.io/docs/2.3.0/quickstart-che as a reference
4. Create a new devfile (`devfile.yaml`) in the project root
    - 4.1. Point to the Python UDI image: registry.access.redhat.com/ubi9/python-39:1-1739420387
    - 4.2. Configure memory and CPU
    - 4.3. Do not rely on the CLI anymore!
        - 4.3.1. Create a build task for installation
        - 4.3.2. Create a run task for consumer.py
        - 4.3.3. Create a run task for producer.py
        - 4.3.4. Create a run task for main.py
    - 4.4. Set the `KAFKA_BOOTSTRAP_SERVERS` environment variable to the Kafka instance you created
5. Create a new topic named `today` via the producer command
    - 5.1. Add 7-8 messages to it
6. Run the consumer command and point the topic to `today`
    - 6.1. Verify messages are funneling through
7. Run the web application (if you have built a command for main.py, you can use that)
    - 7.1. In the topics input, enter `today` and verify messages are funneling through

---

## 🧩 **Challenge**

- Add your devfile tasks to `tasks.json`
- Load 3 Python extensions into a `.vscode/extensions.json` file
- Write a few small unit tests using mocks for the Kafka producer to verify that your logic (such as input handling and retry mechanisms) works as expected without needing a real Kafka cluster.

---

## 🥚 **Easter Eggs!**

- There is an easter egg in the code
- Be thorough! Look through the code.

---

## **Key Takeaways**

- Built your first devfile!
- Configured your DevSpaces / odo workspace
- Customized your VS Code environment
- First interaction with Streams for Apache Kafka
- Introductory patterns around event-driven architecture
