Feature #65: track metrics of failed tests #82

Status: Open. Wants to merge 35 commits into base: master.

Commits (35):
- fix(79): Ensure test supposed to fail also fail, when monitoring is (22ddcd0, May 8, 2024)
- Merge branch 'master' into fix/79-no-monitor (7f679d7, lhpt2, May 8, 2024)
- Add vscode config to debug tests (4a8c5d9, May 14, 2024)
- Refactor: Move profiling logic from unmaintained memory-profiler module (b56ffb7, May 14, 2024)
- Refactor: Add docstring for retval in memory_usage func (d74320b, May 14, 2024)
- Refactor profiler.py: Simplify functions and MemTimer Class to only (e63ca6c, May 14, 2024)
- License and Copyright notice profiler.py: Add memory_profiler copyright (50c3f3a, May 14, 2024)
- Refactor (profiler, pytest_monitor, session): Adjust memory_usage (ca5b209, May 14, 2024)
- Refactor (handler.py, pytest_monitor.py, session.py): Add table (ba71d58, May 14, 2024)
- Refactor (handler.py, pytest_monitor.py, session.py): Add table (c5974c9, May 14, 2024)
- Merge branch 'feature/track-metrics-of-failed' of github.com:einhunde… (ab3a1b1, May 17, 2024)
- Merge branch 'feature/track-metrics-of-failed' of github.com:einhunde… (c8b14f5, May 17, 2024)
- Merge branch 'feature/track-metrics-of-failed' of github.com:einhunde… (24555f0, May 17, 2024)
- Refactor check_create_test_passed_column to use a in memory db (6fbcfb2, May 21, 2024)
- test_monitor.py: Add test to check for proper setup of new database c… (a0990be, May 21, 2024)
- Refactor test_monitor.py: Move db handler tests into own test source (4d57393, May 28, 2024)
- Refactor test_monitor_handler.py: Rename handler test functions to (6985c72, May 28, 2024)
- Add support for Bitbucket CI: Includes function to generate description (57d56b0, Jun 7, 2024)
- Changes: Update the sqlite test functions to the newest versions (b83b5f3, Jun 10, 2024)
- Merge branch 'master' into feature/track-metrics-of-failed (9ed6212, Jun 13, 2024)
- Remove memory_profiler dependency (profiler now inside own module) (17d1c1e, Jun 13, 2024)
- Fix: Let profiler.py only kill the recently created MemTimer when a t… (69f5020, Jun 18, 2024)
- Add cmd flag to disabling monitoring for failed tests. (a0a55b0, Jun 18, 2024)
- Add docstrings to tests and add a basic sqlite handler test. (e4ed1e4, Jun 18, 2024)
- Profiler.py: Add hint explaining kill of only recent MemTimer (ba45b42, Jun 18, 2024)
- Update changelog for feature #65 (39d573d, Jun 18, 2024)
- Update documentation to explain new feature/changes. (e7e3588, Jun 18, 2024)
- Fix issues reported by Flake8 and Ruff (59461b1, Jun 18, 2024)
- Fix: Add backwards compatible type annotation in profiler.py (64cd843, Jun 18, 2024)
- Fix: Remove unneeded pytester fixture in test_monitor_handler.py (aec9da9, Jun 18, 2024)
- Update AUTHORS file. (aeadf18, Jun 18, 2024)
- Changes to fix, raise exception inside wrapped_function(), avoid issues (b2b437a, Jul 3, 2024)
- Minor fix: Don't raise e, just use raise instead (no var needed here) (d831b4f, Jul 3, 2024)
- Merge branch 'fix/79-no-monitor' into feature/track-metrics-of-failed (69391ff, Jul 3, 2024)
- Fix: Catch BaseException instead of Exception in profiler.py (3a0632e, Jul 9, 2024)
7 changes: 7 additions & 0 deletions .vscode/settings.json
@@ -0,0 +1,7 @@
{
    "python.testing.pytestArgs": [
        "tests"
    ],
    "python.testing.unittestEnabled": false,
    "python.testing.pytestEnabled": true
}
1 change: 1 addition & 0 deletions AUTHORS
@@ -4,3 +4,4 @@ Contributors include:
- Raymond Gauthier (jraygauthier) added Python 3.5 support.
- Kyle Altendorf (altendky) fixed bugs on session teardown
- Hannes Engelhardt (veritogen) added Bitbucket CI support.
- Lucas Haupt (lhpt2) added profiler.py and support for tracking failed tests.
2 changes: 2 additions & 0 deletions docs/sources/changelog.rst
@@ -3,6 +3,8 @@ Changelog
=========

* :release:`to be discussed`
* :feature:`#65` Monitor failed tests by default and add the ``--no-failed`` flag to turn off monitoring of failed tests.
* :bug:`#79` Fix a bug where the command-line flag ``--no-monitor`` caused tests that are supposed to fail to pass instead.
* :feature:`#75` Automatically gather CI build information for Bitbucket CI.

* :release:`1.6.6 <2023-05-06>`
8 changes: 8 additions & 0 deletions docs/sources/configuration.rst
@@ -79,6 +79,14 @@ Disable monitoring
If you need for some reason to disable the monitoring, pass the *\-\-no-monitor* option.


Disable monitoring of failed tests
----------------------------------

By default, failing tests are monitored and stored in the database. The database has an additional
column that indicates whether a test passed (boolean value). If you only want to monitor successful
tests, pass the *\-\-no-failed* option.
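
For example, a run with monitoring of failed tests turned off can be started from a small driver
script. This is a minimal sketch, not part of the plugin's documentation: ``pytest.main`` is
pytest's public programmatic entry point, the ``tests`` directory name is only illustrative, and
pytest-monitor must be installed so that ``--no-failed`` is recognized.

   # Minimal sketch: run the suite with pytest-monitor active but with
   # monitoring of failed tests disabled via the new --no-failed flag.
   import pytest

   exit_code = pytest.main(["tests", "--no-failed"])
   print(f"pytest finished with exit code {exit_code}")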


Describing a run
----------------

2 changes: 2 additions & 0 deletions docs/sources/operating.rst
@@ -129,5 +129,7 @@ CPU_USAGE (FLOAT)
System-wide CPU usage as a percentage (100 % is equivalent to one core).
MEM_USAGE (FLOAT)
Maximum resident memory used during the test execution (in megabytes).
TEST_PASSED (BOOLEAN)
Boolean value indicating whether the test passed.

In the local database, these Metrics are stored in table `TEST_METRICS`.
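
As an illustration of the new column, failed tests can be pulled straight from the local result
database. This is a minimal sketch, not plugin code: ``.pymon`` is assumed to be the default
database file in the pytest rootdir, and SQLite stores the boolean as 0/1.

   # Minimal sketch: list the items that failed, with their total time and
   # peak memory, by filtering on the new TEST_PASSED column.
   import sqlite3

   cnx = sqlite3.connect(".pymon")
   rows = cnx.execute(
       "SELECT ITEM, TOTAL_TIME, MEM_USAGE FROM TEST_METRICS WHERE TEST_PASSED = 0"
   ).fetchall()
   for item, total_time, mem_usage in rows:
       print(f"{item}: {total_time:.3f} s, {mem_usage:.1f} MB (failed)")
   cnx.close()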
3 changes: 2 additions & 1 deletion docs/sources/remote.rst
@@ -115,7 +115,8 @@ POST /metrics/
user_time: float,
kernel_time: float,
cpu_usage: float,
-      mem_usage: float
+      mem_usage: float,
+      passed: bool,
}

**Return Codes**: Must return *201* (*CREATED*) if the **Metrics** has been created
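
For remote storage, the new field simply rides along in the POST body. The following client call
is a sketch only: the server URL is a placeholder, the payload is truncated to the fields shown
above, and the third-party ``requests`` package is assumed to be available.

   # Minimal sketch: send one metrics record, including the new "passed"
   # flag, to a pytest-monitor-compatible metrics server.
   import requests

   payload = {
       # ... session, environment and item identification fields omitted ...
       "user_time": 0.12,
       "kernel_time": 0.03,
       "cpu_usage": 0.85,
       "mem_usage": 42.5,
       "passed": False,
   }
   response = requests.post("https://metrics.example.org/metrics/", json=payload)
   assert response.status_code == 201  # CREATED, as required above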
126 changes: 79 additions & 47 deletions pytest_monitor/handler.py
@@ -6,18 +6,41 @@ def __init__(self, db_path):
         self.__db = db_path
         self.__cnx = sqlite3.connect(self.__db) if db_path else None
         self.prepare()
+        # check if new table column is existent, if not create it
+        self.check_create_test_passed_column()
+
+    def close(self):
+        self.__cnx.close()
+
+    def __del__(self):
+        self.__cnx.close()
+
+    def check_create_test_passed_column(self):
+        cursor = self.__cnx.cursor()
+        # check for test_passed column,
+        # table exists bc call happens after prepare()
+        cursor.execute("PRAGMA table_info(TEST_METRICS)")
+        has_test_column = any(
+            column[1] == "TEST_PASSED" for column in cursor.fetchall()
+        )
+        if not has_test_column:
+            cursor.execute(
+                "ALTER TABLE TEST_METRICS ADD COLUMN TEST_PASSED BOOLEAN DEFAULT TRUE;"
+            )
+            self.__cnx.commit()
 
     def query(self, what, bind_to, many=False):
         cursor = self.__cnx.cursor()
         cursor.execute(what, bind_to)
         return cursor.fetchall() if many else cursor.fetchone()
 
     def insert_session(self, h, run_date, scm_id, description):
-        with self.__cnx:
-            self.__cnx.execute(
-                "insert into TEST_SESSIONS(SESSION_H, RUN_DATE, SCM_ID, RUN_DESCRIPTION)" " values (?,?,?,?)",
-                (h, run_date, scm_id, description),
-            )
+        self.__cnx.execute(
+            "insert into TEST_SESSIONS(SESSION_H, RUN_DATE, SCM_ID, RUN_DESCRIPTION)"
+            " values (?,?,?,?)",
+            (h, run_date, scm_id, description),
+        )
+        self.__cnx.commit()
 
     def insert_metric(
         self,
@@ -35,51 +58,53 @@ def insert_metric(
         kernel_time,
         cpu_usage,
         mem_usage,
+        passed: bool,
     ):
-        with self.__cnx:
-            self.__cnx.execute(
-                "insert into TEST_METRICS(SESSION_H,ENV_H,ITEM_START_TIME,ITEM,"
-                "ITEM_PATH,ITEM_VARIANT,ITEM_FS_LOC,KIND,COMPONENT,TOTAL_TIME,"
-                "USER_TIME,KERNEL_TIME,CPU_USAGE,MEM_USAGE) "
-                "values (?,?,?,?,?,?,?,?,?,?,?,?,?,?)",
-                (
-                    session_id,
-                    env_id,
-                    item_start_date,
-                    item,
-                    item_path,
-                    item_variant,
-                    item_loc,
-                    kind,
-                    component,
-                    total_time,
-                    user_time,
-                    kernel_time,
-                    cpu_usage,
-                    mem_usage,
-                ),
-            )
+        self.__cnx.execute(
+            "insert into TEST_METRICS(SESSION_H,ENV_H,ITEM_START_TIME,ITEM,"
+            "ITEM_PATH,ITEM_VARIANT,ITEM_FS_LOC,KIND,COMPONENT,TOTAL_TIME,"
+            "USER_TIME,KERNEL_TIME,CPU_USAGE,MEM_USAGE,TEST_PASSED) "
+            "values (?,?,?,?,?,?,?,?,?,?,?,?,?,?,?)",
+            (
+                session_id,
+                env_id,
+                item_start_date,
+                item,
+                item_path,
+                item_variant,
+                item_loc,
+                kind,
+                component,
+                total_time,
+                user_time,
+                kernel_time,
+                cpu_usage,
+                mem_usage,
+                passed,
+            ),
+        )
+        self.__cnx.commit()
 
     def insert_execution_context(self, exc_context):
-        with self.__cnx:
-            self.__cnx.execute(
-                "insert into EXECUTION_CONTEXTS(CPU_COUNT,CPU_FREQUENCY_MHZ,CPU_TYPE,CPU_VENDOR,"
-                "RAM_TOTAL_MB,MACHINE_NODE,MACHINE_TYPE,MACHINE_ARCH,SYSTEM_INFO,"
-                "PYTHON_INFO,ENV_H) values (?,?,?,?,?,?,?,?,?,?,?)",
-                (
-                    exc_context.cpu_count,
-                    exc_context.cpu_frequency,
-                    exc_context.cpu_type,
-                    exc_context.cpu_vendor,
-                    exc_context.ram_total,
-                    exc_context.fqdn,
-                    exc_context.machine,
-                    exc_context.architecture,
-                    exc_context.system_info,
-                    exc_context.python_info,
-                    exc_context.compute_hash(),
-                ),
-            )
+        self.__cnx.execute(
+            "insert into EXECUTION_CONTEXTS(CPU_COUNT,CPU_FREQUENCY_MHZ,CPU_TYPE,CPU_VENDOR,"
+            "RAM_TOTAL_MB,MACHINE_NODE,MACHINE_TYPE,MACHINE_ARCH,SYSTEM_INFO,"
+            "PYTHON_INFO,ENV_H) values (?,?,?,?,?,?,?,?,?,?,?)",
+            (
+                exc_context.cpu_count,
+                exc_context.cpu_frequency,
+                exc_context.cpu_type,
+                exc_context.cpu_vendor,
+                exc_context.ram_total,
+                exc_context.fqdn,
+                exc_context.machine,
+                exc_context.architecture,
+                exc_context.system_info,
+                exc_context.python_info,
+                exc_context.compute_hash(),
+            ),
+        )
+        self.__cnx.commit()
 
     def prepare(self):
         cursor = self.__cnx.cursor()
@@ -109,6 +134,7 @@ def prepare(self):
                 KERNEL_TIME float, -- time spent in kernel space
                 CPU_USAGE float, -- cpu usage
                 MEM_USAGE float, -- Max resident memory used.
+                TEST_PASSED boolean, -- boolean indicating if test passed
                 FOREIGN KEY (ENV_H) REFERENCES EXECUTION_CONTEXTS(ENV_H),
                 FOREIGN KEY (SESSION_H) REFERENCES TEST_SESSIONS(SESSION_H)
             );"""
@@ -131,3 +157,9 @@
"""
)
self.__cnx.commit()

def get_env_id(self, env_hash):
query_result = self.query(
"SELECT ENV_H FROM EXECUTION_CONTEXTS WHERE ENV_H= ?", (env_hash,)
)
return query_result[0] if query_result else None
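
The column check added above can be exercised in isolation against an in-memory database, much as
the handler tests added in this pull request do. The following is a standalone sketch, not the
plugin's own test code, with the table layout reduced to two columns for brevity.

   # Standalone sketch: demonstrate how the PRAGMA-based check adds the
   # TEST_PASSED column to a pre-existing TEST_METRICS table.
   import sqlite3

   cnx = sqlite3.connect(":memory:")
   cnx.execute("CREATE TABLE TEST_METRICS (ITEM varchar(4096), MEM_USAGE float)")

   columns = [row[1] for row in cnx.execute("PRAGMA table_info(TEST_METRICS)")]
   if "TEST_PASSED" not in columns:
       cnx.execute("ALTER TABLE TEST_METRICS ADD COLUMN TEST_PASSED BOOLEAN DEFAULT TRUE")
       cnx.commit()

   columns = [row[1] for row in cnx.execute("PRAGMA table_info(TEST_METRICS)")]
   print(columns)  # ['ITEM', 'MEM_USAGE', 'TEST_PASSED']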