Automatic commit: D1.2 Quality Assessment Plan (QAP)

This commit is contained in:
Michele Carraglia 2025-12-02 13:36:30 +01:00
parent 55754958aa
commit ed75669158
19 changed files with 166 additions and 0 deletions

@ -0,0 +1,166 @@
ID;Title;Metric;Service Level
For incident management
KPI1.1; Incident acknowledgement time
KPI1.2; Incident intervention time
KPI1.3; Incident Resolution time
For IT Software Development Life Cycle events
KPI1.4; Number of release back-outs; Based on the release data available; Zero releases
KPI1.5; Number of defects in release/change in PRD - Incidents caused by new releases; Based on the total number of new Blocking, Major, Minor defects logged in the defect management tool during the period; The number of defects in a release in PRODUCTION should not exceed the following threshold: zero Blocking, one Major, five Minor
KPI1.6; Defects not found during test execution in non-PRODUCTION; Analysis of all new Blocking, Major, Minor defects logged in the defect management tool in the PRODUCTION environment which had no record of tests conducted in non-PRODUCTION environments; No new defects should be found in PRODUCTION
KPI1.7; Test execution coverage; Total number of executed test cases divided by the total number of test cases planned to be executed (%). Information to be gathered from the Test Management tool; 100% of planned test cases are executed
For IT Software Development Life Cycle events
KPI1.13; Implementation of updates to Configuration Management DataBase (CMDB) arising out of change management process; Based on data available in CMDB for every Configuration Item that required to be updated; One item per month not kept up-to-date within 10 working days of change being made
KPI1.14; Number of Knowledge Management (KM) artefacts with an expired review date; Date of last update (based on document control information) uploaded against date of latest upload version; Zero (0) deviations
The contractual duties are monitored bimonthly by the Board;
KPI1.15; Compliance with the delivery date of the bi-monthly progress report; number of elapsed working days between the effective delivery date and the expected due date of the report; max. 1 working day
KPI1.16; Compliance with the expected content of the bi-monthly progress report; Number of reports not meeting the content requirements; Zero non-compliance
KPI1.17; Quality of the Financial report (including invoicing, evidence, etc.) submitted to the Ministry; Number of reports not meeting the content requirements; Zero non-compliance
For incident management (Same as WP1 incident mgt KPI1.1 to 1.3)
For DC services operations events
KPI2.3; System/service availability; Uptime and availability of the PRODUCTION environment and related services as reported by the monitoring systems; The minimum availability is between 95.00% and 97.00% over 28 rolling days
KPI2.4; ibidem for the TEST environment; The minimum availability is between 90.00% and 95.00% over 28 rolling days
KPI2.5; ibidem for the DEVELOPMENT environment; The minimum availability is between 92.00% and 95.00% over 28 rolling days
KPI2.6; Quantity of incidents due to capacity (CPU, memory, storage) shortages; Based on data available in the ticketing tool; Zero incidents should happen because of insufficient service or component capacity
KPI2.7; Quantity of incidents having impacted the security of the ITSERR environments; 0 per month
KPI2.8; Quantity of incidents having impacted the backup of the ITSERR environments; max. 1 per month
KPI2.9; Quantity of incidents not solved within the time frame foreseen by the service level; per month 0 for CRITICAL, 1 for HIGH, 3 for MODERATE and 5 for LOW.
KPI2.10; Quantity of system maintenance operations not performed by the DCS team (e.g. security patching); max. 1 per month
Deliverables quality, deadlines and service level performance are monitored monthly by the Infrastructure Manager and the DCS Operation Leader
KPI2.11; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI2.12; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month
KPI2.13; Compliance with agreed planned activities; Number of activities that deviate from the agreed set of activities; max. 2 per month
KPI2.14; Delay in successful implementation of corrective action plan following quality audits; Number of working days of delay beyond expected implementation; zero working days
KPI2.15; Number of incident reports due to quality issues with a deliverable; Number of deliverables not meeting the requirements defined in the DOP/QAP/SMP/CDP; max. 1 per month
KPI2.16; Service Desk reachability through communication channels during working hours; 90% of the calls or emails to the Desk are acknowledged; min. 90%
For DCS services (ITIL Life Cycle) events for the Configuration Management DataBase, the KPIs are the same as for WP1 KPI1.13 and 1.14
General;
KPI3.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI3.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month
CRITERION
For development phase;
KPI3.1; Feedback from CRITERION prototype testers, to be assessed within M16, then M22, then M26; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI3.2; Feedback from CRITERION prototype testers, to be assessed within M16, then M22, then M26; %age of positive feedback from a sample of at least 30 scholars; min. 75%
For ex-post assessment;
KPI3.3; usage assessment of CRITERION, to be assessed within 3 years after the end of the project; # of users accessing the online database.
GNORM
For development phase;
KPI3.4; efficiency of the data mining algorithm developed in A3.3, to be assessed within M20; %age of correctly extracted paratextual elements; min. 85%
KPI3.5; efficiency of the data mining algorithm developed in A3.3, to be assessed within M20; %age of incorrectly extracted paratextual elements; max. 15%
KPI3.6; efficiency of automatic properties annotation system, developed in A3.3, to be assessed within M24; %age of correctly assigned properties; min. 95%
KPI3.7; efficiency of automatic properties annotation system, developed in A3.3, to be assessed within M24; %age of incorrectly assigned properties; max. 5%
KPI3.8; Feedback from GNORM prototype testers, to be assessed within M18, then M24, then M26; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI3.9; Feedback from GNORM prototype testers, to be assessed within M18, then M24, then M26; %age of positive feedback from a sample of at least 30 scholars; min. 75%
For ex-post assessment;
KPI3.10; usage assessment of GNORM-powered databases (D3.3), to be assessed within 3 years after the end of the project; # of users accessing the online databases
General;
KPI4.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days.
KPI4.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month.
For development phase;
KPI4.1; Efficiency of word embedding extraction tool for DaMSym, to be assessed within M12; %age of correctly extracted properties; min. 90%
KPI4.2; Efficiency of word embedding extraction tool for DaMSym, to be assessed within M12; %age of incorrectly extracted properties; max. 10%
KPI4.3; Efficiency of the machine translation tool for DaMSym, to be assessed within M18; %age of correct translations; min. 85%
KPI4.4; Efficiency of the machine translation tool for DaMSym, to be assessed within M18; %age of incorrect translations; max. 10%
KPI4.5; Efficiency of the text classification and document understanding tool for DaMSym, to be assessed within M20; %age of correct classifications; min. 85%
KPI4.6; Efficiency of the text classification and document understanding tool for DaMSym, to be assessed within M20; %age of incorrect classifications; max. 15%
For pre-release phase;
KPI4.7; Feedback on DaMSym platform (as described in D4.1.1), to be assessed within M16, then M22, then M24; %age of positive feedback from a sample of at least 30 scholars; min. 75%.
KPI4.7; Feedback on DaMSym platform (as described in D4.1.1), to be assessed within M16, then M22, then M24; %age of negative feedback from a sample of at least 30 scholars; max. 10%.
For ex-post assessment;
KPI4.8; usage assessment of D4.3 (the published database on the Creed powered by DaMSym), to be assessed within 3 years after the end of the project; # of users accessing the database
General;
KPI5.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days.
KPI5.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month.
For development phase;
KPI5.1; Efficacy of algorithm for automatic text recognition (D5.1.1), to be assessed within M16; %age of correctly recognised text; min. 90%
KPI5.2; Efficacy of algorithm for automatic text recognition (D5.1.1), to be assessed within M16; %age of incorrectly recognised text; max. 10%
KPI5.3; Efficacy of algorithm for metadata extraction (D5.1.1), to be assessed within M16; %age of correctly extracted metadata; min. 90%
KPI5.4; Efficacy of algorithm for metadata extraction (D5.1.1), to be assessed within M16; %age of incorrectly extracted metadata; max. 10%
KPI5.5; Efficacy of intelligent and automated cataloguing prototype (D5.1.2), to be assessed within M20, then M24, then M28; %age of correctly catalogued entries; min. 90%
KPI5.6; Efficacy of intelligent and automated cataloguing prototype (D5.1.2), to be assessed within M20, then M24, then M28; %age of incorrectly catalogued entries; max. 10%
For ex-post assessment;
KPI5.7; integration with national bibliographic systems, to be assessed within 3 years after the end of the project; # of libraries using DigitalMaktaba
KPI5.8; use by the research communities, to be assessed within 3 years after the end of the project; # of users accessing DigitalMaktaba-powered catalogues
General
KPI6.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days.
KPI6.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month.
For development phase;
KPI6.1; Efficacy of the algorithm for automatic metadata extraction on visual and audiovisual data (described in D6.1.1), to be assessed within M12, then M18, then M26; %age of correctly extracted metadata; min. 90%
KPI6.2; Efficacy of the algorithm for automatic metadata extraction on visual and audiovisual data (described in D6.1.1), to be assessed within M12, then M18, then M26; %age of incorrectly extracted metadata; max. 10%
KPI6.3; Efficacy of the algorithm for layout analysis to automatically extract texts and images (described in D6.1.1), to be assessed within M12, then M18, then M26; %age of correctly extracted texts and images; min. 90%
KPI6.4; Efficacy of the algorithm for layout analysis to automatically extract texts and images (described in D6.1.1), to be assessed within M12, then M18, then M26; %age of incorrectly extracted texts and images; max. 10%
For pre-release phase;
KPI6.5; Feedback on YASMINE applied to Sanctuaria (as described in D6.1.2), to be assessed within M28; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI6.6; Feedback on YASMINE applied to Sanctuaria (as described in D6.1.2), to be assessed within M28; %age of negative feedback from a sample of at least 30 scholars; max. 15%
For ex-post assessment;
KPI6.7; usage assessment of D6.3 (the published Plorabunt and Sanctuaria databases), to be assessed within 3 years after the end of the project; # of users accessing the online database.
KPI6.8; usage assessment of D6.4 (the published YASMINE metascraper), to be assessed within 3 years after the end of the project; # of downloads
General;
KPI7.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days.
KPI7.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month.
For development phase;
KPI7.1; Efficacy of HTR/OCR tool (described in D7.1.2), to be assessed within M14; %age of correct recognition; min. 95%
KPI7.2; Efficacy of HTR/OCR tool (described in D7.1.2), to be assessed within M14; %age of wrong recognition; max. 5%
KPI7.3; Efficacy of the algorithm for linking regesta to their sources (described in D7.1.2), to be assessed within M18; %age of correct links from regesta to the source; min. 90%
KPI7.4; Efficacy of the algorithm for linking regesta to their sources (described in D7.1.2), to be assessed within M18; %age of missing links from regesta to the source; max. 20%
KPI7.5; Efficacy of the algorithm for automatically producing regesta from sources (described in D7.1.2), to be assessed within M24; %age of correct summarisation of sources into meaningful regesta; min. 80%
KPI7.6; Efficacy of the algorithm for automatically producing regesta from sources (described in D7.1.2), to be assessed within M24; %age of incorrect summarisation of sources, production of meaningless or incorrect regesta; max. 20%
For pre-release phase;
KPI7.7; Feedback on REVER (user interface + algorithms, described in D7.1.3), to be assessed within M14, then M22, then M28; %age of positive feedback from a sample of at least 30 scholars; min. 75%.
KPI7.8; Feedback on REVER (user interface + algorithms, described in D7.1.3), to be assessed within M14, then M22, then M28; %age of negative feedback from a sample of at least 30 scholars; max. 10%.
For ex-post assessment;
KPI7.9; usage assessment of D7.3 (the online version of the Regesta Pontificum Romanorum), to be assessed within 3 years after the end of the project; # of users accessing the online database
General;
KPI8.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days.
KPI8.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month.
For development phase;
KPI8.1; Efficiency of the algorithm for finding Biblical and Quranic quotations in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2); to be assessed within M18; %age of correct quotations found; min. 90%
KPI8.2; Efficiency of the algorithm for finding Biblical and Quranic quotations in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2); to be assessed within M18; %age of wrong quotations found; max. 5%
KPI8.3; Efficiency of the algorithm for finding Biblical and Quranic quotations networks in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2); to be assessed within M20; %age of correct quotation networks found; min. 85%
KPI8.4; Efficiency of the algorithm for finding Biblical and Quranic quotations networks in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2); to be assessed within M20; %age of wrong quotation networks found; max. 10%
KPI8.5; Efficiency of the algorithm for suggesting Biblical and Quranic allusions in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2); to be assessed within M22; %age of correct allusions proposed; min. 75%
KPI8.6; Efficiency of the algorithm for suggesting Biblical and Quranic allusions in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2); to be assessed within M22; %age of wrong allusions proposed; max. 20%
For pre-release phase;
KPI8.7; Feedback on uBIQUity deployed (based on the analysis conducted for D8.1.3), to be assessed within M16, then M22, then M30; %age of positive feedback from a sample of at least 30 scholars; min. 75%.
KPI8.8; Feedback on uBIQUity deployed (based on the analysis conducted for D8.1.3), to be assessed within M16, then M22, then M30; %age of negative feedback from a sample of at least 30 scholars; max. 10%.
For ex-post assessment;
KPI8.9; usage assessment of D8.3 (the online version of Biblical and Quranic texts and commentaries, powered by uBIQUity), to be assessed within 3 years after the end of the project; # of users accessing the online database
General;
KPI9.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI9.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month
For development phase;
KPI9.1; Efficacy of the EnLil prototype, to be assessed within M12, then M22, then M24; %age of correctly represented objects; min. 95%
KPI9.2; Efficacy of the EnLil prototype, to be assessed within M12, then M22, then M24; %age of incorrectly represented objects; max. 5%
KPI9.3; Efficacy of the MiRAr prototype, to be assessed within M12, then M22, then M24; %age of correctly represented spaces; min. 95%
KPI9.4; Efficacy of the MiRAr prototype, to be assessed within M12, then M22, then M24; %age of incorrectly represented spaces; max. 5%
KPI9.5; Efficacy of the ACIS prototype, to be assessed within M12, then M22, then M24; %age of correct links between archives per object; min. 80%
KPI9.6; Efficacy of the ACIS prototype, to be assessed within M12, then M22, then M24; %age of incorrect links between archives per object; max. 20%
For pre-release phase;
KPI9.7; Feedback on the TAURUS toolkit (EnLil, MiRAr and ACIS, as described in D9.4), to be assessed within M30; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI9.8; Feedback on the TAURUS toolkit (EnLil, MiRAr and ACIS, as described in D9.4), to be assessed within M30; %age of negative feedback from a sample of at least 30 scholars; min. 75%
For ex-post assessment;
KPI9.9; usage assessment of the TAURUS toolkit (EnLil, MiRAr and ACIS, as described in D9.4), to be assessed within 3 years after the end of the project; # of users accessing the online database.
General;
KPI10.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI10.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month
For development phase;
KPI10.1; Feedback on the Preliminary data model (D10.2), to be assessed within M18; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI10.2; Feedback on the Preliminary data model (D10.2), to be assessed within M18; %age of negative feedback from a sample of at least 30 scholars; max. 15%
KPI10.3; Feedback on Annotation tools and interactive interface (as described in D10.4), to be assessed within M12, then M24, then M26; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI10.4; Feedback on Annotation tools and interactive interface (as described in D10.4), to be assessed within M12, then M24, then M26; %age of negative feedback from a sample of at least 30 scholars; max. 15%
For pre-release phase;
KPI10.5; Feedback on ReTINA (as described in D10.5), to be assessed within M30; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI10.6; Feedback on ReTINA (as described in D10.5), to be assessed within M30; %age of negative feedback from a sample of at least 30 scholars; max. 15%
For ex-post assessment;
KPI10.7; usage assessment of ReTINA-powered database, to be assessed within 3 years after the end of the project; # of users accessing the online database.
The specific TNA indicators are monitored each semester by the TNA Team Leader
For each TNA call for proposal cycle
KPI11.1; Level of scientific relevance, measured by the percentage of the scientific Peer Review Committee (PRC) that analysed the proposals; Based on the level of presence of each PRC member; min. 90%
KPI11.2; Access provision efficiency, measured by the ratio between the planned number of access users and the actual users per call; min. 90%
KPI11.3; TNA efficiency is measured by the ratio between the proposals approved by the PRC and the access slots provided by ITSERR; min. 90%
KPI11.4; Access Quality, measured by the average of the evaluations from questionnaires returned by TNA users; min. 80% of satisfied users
Deliverables quality and deadlines are monitored monthly by the Infrastructure Manager, the TNA Team Leader and the TNA WP leader
KPI11.5; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI11.6; Compliance with agreed planned activities; Number of activities that deviate from the agreed set of activities; max. 2 per month
KPI11.7; Number of Knowledge Management (KM) artefacts with an expired review date; Date of last update (based on document control information) uploaded against date of latest upload version; Zero (0) deviations
Master Data Management events are monitored monthly
KPI12.1; Number of Master Data Model release back-outs; Based on the release data available; Zero releases
KPI12.2; Data test execution coverage; Total number of executed test cases divided by the total number of test cases planned to be executed (%). Information to be gathered from the Test Management tool; 100% of planned test cases are executed
KPI12.3; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI12.4; Compliance with agreed planned activities; Number of activities that deviate from the agreed set of activities; max. 2 per month
KPI12.5; Delay in successful implementation of corrective action plan following quality audits; Number of working days of delay beyond expected implementation; zero working days
KPI12.6; Number of incident reports due to quality issues with a deliverable; Number of deliverables not meeting the requirements defined in the DOP/QAP/SMP/CDP; max. 1 per month
KPI12.7; Implementation of updates to Configuration Management DataBase (CMDB) arising out of change management process; Based on data available in CMDB for every Configuration Item that required to be updated; One item per month not kept up-to-date within 10 working days of change being made
KPI12.8; Number of Knowledge Management (KM) artefacts with an expired review date; Date of last update (based on document control information) uploaded against date of latest upload version; Zero (0) deviations
1 ID;Title;Metric; Service Level
2 For incident management
3 KPI1.1; Incident acknowledgement time
4 KPI1.2; Incident intervention time
5 KPI1.3; Incident Resolution time
6 For IT Software Development Life Cycle events
7 KPI1.4; Number of release back-outs; Based on the release data available; Zeroreleases
8 KPI1.5; Number of defects in release/change in PRD - Incidents caused by new releases; Based on the total number of new Blocking, Major, Minor defects logged in defect management tool during the period; The number of defects in a release in PRODUCTION should not exceed the following threshold; zero Blocking, one Major, five Minor
9 KPI1.6; Defects not found during test execution in non-PRODUCTION ; Analysis of all new Blocking, Major, Minor defects logged in defect management tool in the PRODUCTION environment which had no record for the tests conducted in nonPRODUCTION environments; No new defects should be found in PRODUCTION
10 KPI1.7; Test execution coverage; Total number of executed test cases by the total number of test cases planned to be executed (%). Information to be gathered from the Test Management tool; 100% of planned test cases are executed For IT Software Development Life Cycle events
11 KPI1.13; Implementation of updates to Configuration Management DataBase (CMDB) arising out of change management process; Based on data available in CMDB for every Configuration Item4 that required to be updated; One item per month not kept up-to-date within 10 working days of change being made
12 KPI1.14; Number of Knowledge Management (KM) artefacts with an expired review date; Date of last update (based on document control information) uploaded against data of latest upload version; Zero (0) deviations
13 The contractual duties are monitored bimonthly by the Board;
14 KPI1.15; Compliance with the delivery date of the bi-monthly progress report; number of elapsed working days between the effective delivery date and the expected due date of the report; max. 1 working day
15 KPI1.16; Compliance with the expected content of the bi-monthly progress report; Number of report not meeting the content requirements; Zero non-compliance
16 KPI1.17; Quality of the Financial report (including invoicing, evidences, etc.) submitted to the Ministry; Number of report not meeting the content requirements; Zero noncompliance
17 For incident management (Same as WP1 incident mgt KPI1.1 to 1.3)
18 For DC services operations events
19 KPI2.3; System/service availability; Uptime and availability of the PRODUCTION environment and related services as reported by the monitoring systems; The minimal availability is between 95,00% and 97,00 % over 28 rolling days
20 KPI2.4; ibidem for the TEST environment; The minimal availability is between 90,00% and 95,00 % over 28 rolling days
21 KPI2.5; ibidem for the DEVELOPMENT environment; The minimal availability is between 92,00% and 95,00 % over 28 rolling days
22 KPI2.6; Quantity of incidents due to capacity (CPU, Memory, storage)shortages; Based on data available in ticketing tool; Zero incidents should happen because of insufficient service or component capacity
23 KPI2.7; Quantity of incidents having impacted the security of the ITSERR environments; 0 per month
24 KPI2.8; Quantity of incidents having impacted the backup of the ITSERR environments; max. 1 per month
25 KPI2.9; Quantity of incidents not solved within the time frame foreseen by the service level; per month 0 for CRITICAL, 1 for HIGH, 3 for MODERATE and 5 for LOW.
26 KPI2.10; Quantity of system maintenance operations not performed by the DCS team (i.e. security patching, etc.); max. 1 per month Deliverables quality, deadlines and services levels performance are monitored monthly by the Infrastructure Manager and the DCS Operation Leader
27 KPI2.11; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
28 KPI2.12; Compliance with the expected content of the deliverables as agreed in the (DTM); Number of reports not meeting the content requirements; 1max. noncompliance/month
29 KPI2.13; Compliance with agreed planned activities; Number of activities that deviate from the agreed set of activities; max. 2 per month
30 KPI2.14; Delay in successful implementation of corrective action plan following quality audits; Number of working days of delay beyond expected implementation; zero working days
31 KPI2.15; # incident report due to quality issues with a deliverables; Number of deliverables not meeting the requirements defined in the DOP/QAP/SMP/CDP; max. 1 per month
32 KPI2.16; Service Desk reachability through communication channels during working hours; 90% of the calls or emails to the Desk are acknowledged; min. 90%
33 For DCS services (ITIL Life Cycle) events for the Configuration Management DataBase, the KPIs are the same as for WP1 KPI1.3 and 1.14
34 General;
35 KPI3.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
36 KPI3.0b; Compliance with the expected content of the deliverables as agreed in the (DTM); Number of reports not meeting the content requirements; 1max. noncompliance/month
37 CRITERION
38 For development phase;
39 KPI3.1; Feedback on CRITERION prototype testers, to be assessed within M16, then M22, then M26; %age of positive feedback from a sample of at least 30 scholars; min. 75%
40 KPI3.2; Feedback on CRITERION prototype testers, to be assessed within M16, then M22, then M26; %age of positive feedback from a sample of at least 30 scholars; min. 75%
41 For ex-post assessment;
42 KPI3.3; usage assessment of CRITERION, to be assessed within 3 years after the end of the project; # of users accessing the online database.
43 GNORM
44 For development phase;
45 KPI3.4; efficiency of the data mining algorithm developed in A3.3, to be assessed within M20; %age of correctly extracted paratextual elements; min. 85%
46 KPI3.5; efficiency of the data mining algorithm developed in A3.3, to be assessed within M20; %age of incorrectly extracted paratextual elements; max. 15%
47 KPI3.6; efficiency of automatic properties annotation system, developed in A3.3, to be assessed within M24; %age of correctly assigned properties; min. 95%
48 KPI3.7; efficiency of automatic properties annotation system, developed in A3.3, to be assessed within M24; %age of incorrectly assigned properties; max. 5%
49 KPI3.8; Feedback from GNORM prototype testers, to be assessed within M18, then M24, then M26; %age of positive feedback from a sample of at least 30 scholars; min. 75%
50 KPI3.9; Feedback from GNORM prototype testers, to be assessed within M18, then M24, then M26; %age of positive feedback from a sample of at least 30 scholars; min. 75%
51 For ex-post assessment;
52 KPI3.10; usage assessment of GNORM-powered databases (D3.3), to be assessed within 3 years after the end of the project; # of users accessing the online databases General;
53 KPI4.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days.
54 KPI4.0b; Compliance with the expected content of the deliverables as agreed in the (DTM); Number of reports not meeting the content requirements; 1max. noncompliance/month.
55 For development phase;
56 KPI4.1; Efficiency of word embedding extraction tool for DaMSym; to be assessed within M12; %age of correctly extracted properties; min. 90%
57 KPI4.2; Efficiency of word embedding extraction tool for DaMSym; to be assessed within M12; %age of incorrectly extracted properties; max. 10%
KPI4.3; Efficiency of the machine translation tool for DaMSym, to be assessed within M18; %age of correct translations; min. 85%
KPI4.4; Efficiency of the machine translation tool for DaMSym, to be assessed within M18; %age of incorrect translations; max. 10%
KPI4.5; Efficiency of the text classification and document understanding tool for DaMSym, to be assessed within M20; %age of correct classifications; min. 85%
KPI4.6; Efficiency of the text classification and document understanding tool for DaMSym, to be assessed within M20; %age of incorrect classifications; max. 15%
For pre-release phase;
KPI4.7; Feedback on DaMSym platform (as described in D4.1.1), to be assessed within M16, then M22, then M24; %age of positive feedback from a sample of at least 30 scholars; min. 75%.
KPI4.7; Feedback on DaMSym platform (as described in D4.1.1), to be assessed within M16, then M22, then M24; %age of negative feedback from a sample of at least 30 scholars; max. 10%.
For ex-post assessment;
KPI4.8; usage assessment of D4.3 (the published database on the Creed powered by DaMSym), to be assessed within 3 years after the end of the project; # of users accessing the database
General;
KPI5.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days.
KPI5.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month.
For development phase;
KPI5.1; Efficacy of algorithm for automatic text recognition (D5.1.1), to be assessed within M16; %age of correctly recognised text; min. 90%
KPI5.2; Efficacy of algorithm for automatic text recognition (D5.1.1), to be assessed within M16; %age of incorrectly recognised text; max. 10%
KPI5.3; Efficacy of algorithm for metadata extraction (D5.1.1), to be assessed within M16; %age of correctly extracted metadata; min. 90%
KPI5.4; Efficacy of algorithm for metadata extraction (D5.1.1), to be assessed within M16; %age of incorrectly extracted metadata; max. 10%
KPI5.5; Efficacy of intelligent and automated cataloguing prototype (D5.1.2), to be assessed within M20, then M24, then M28; %age of correctly catalogued entries; min. 90%
KPI5.6; Efficacy of intelligent and automated cataloguing prototype (D5.1.2), to be assessed within M20, then M24, then M28; %age of incorrectly catalogued entries; max. 10%
For ex-post assessment;
KPI5.7; integration with national bibliographic systems, to be assessed within 3 years of the project; # of libraries using DigitalMaktaba
KPI5.8; use by the research communities, to be assessed within 3 years of the project; # of users accessing DigitalMaktaba-powered catalogues
General;
KPI6.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days.
KPI6.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month.
For development phase;
KPI6.1; Efficacy of the algorithm for automatic metadata extraction on visual and audiovisual data (described in D6.1.1), to be assessed within M12, then M18, then M26; %age of correctly extracted metadata; min. 90%
KPI6.2; Efficacy of the algorithm for automatic metadata extraction on visual and audiovisual data (described in D6.1.1), to be assessed within M12, then M18, then M26; %age of incorrectly extracted metadata; max. 10%
KPI6.3; Efficacy of the algorithm for layout analysis to automatically extract texts and images (described in D6.1.1), to be assessed within M12, then M18, then M26; %age of correctly extracted texts and images; min. 90%
KPI6.4; Efficacy of the algorithm for layout analysis to automatically extract texts and images (described in D6.1.1), to be assessed within M12, then M18, then M26; %age of incorrectly extracted texts and images; max. 10%
For pre-release phase;
KPI6.5; Feedback on YASMINE applied to Sanctuaria (as described in D6.1.2), to be assessed within M28; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI6.6; Feedback on YASMINE applied to Sanctuaria (as described in D6.1.2), to be assessed within M28; %age of negative feedback from a sample of at least 30 scholars; max. 15%
For ex-post assessment;
KPI6.7; usage assessment of D6.3 (the published Plorabunt and Sanctuaria databases), to be assessed within 3 years after the end of the project; # of users accessing the online database.
KPI6.8; usage assessment of D6.4 (the published YASMINE metascraper), to be assessed within 3 years after the end of the project; # of downloads
General;
KPI7.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days.
KPI7.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month.
For development phase;
KPI7.1; Efficacy of HTR/OCR tool (described in D7.1.2), to be assessed within M14; %age of correct recognition; min. 95%
KPI7.2; Efficacy of HTR/OCR tool (described in D7.1.2), to be assessed within M14; %age of wrong recognition; max. 5%
KPI7.3; Efficacy of the algorithm for linking regesta to their sources (described in D7.1.2), to be assessed within M18; %age of correct links from regesta to the source; min. 90%
KPI7.4; Efficacy of the algorithm for linking regesta to their sources (described in D7.1.2), to be assessed within M18; %age of missing links from regesta to the source; max. 20%
KPI7.5; Efficacy of the algorithm for automatically producing regesta from sources (described in D7.1.2), to be assessed within M24; %age of correct summarisation of sources into meaningful regesta; min. 80%
KPI7.6; Efficacy of the algorithm for automatically producing regesta from sources (described in D7.1.2), to be assessed within M24; %age of incorrect summarisation of sources, production of meaningless or incorrect regesta; max. 20%
For pre-release phase;
KPI7.7; Feedback on REVER (user interface + algorithms, described in D7.1.3), to be assessed within M14, then M22, then M28; %age of positive feedback from a sample of at least 30 scholars; min. 75%.
KPI7.8; Feedback on REVER (user interface + algorithms, described in D7.1.3), to be assessed within M14, then M22, then M28; %age of negative feedback from a sample of at least 30 scholars; max. 10%.
For ex-post assessment;
KPI7.9; usage assessment of D7.3 (the online version of the Regesta Pontificum Romanorum), to be assessed within 3 years after the end of the project; # of users accessing the online database
General;
KPI8.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days.
KPI8.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month.
For development phase;
KPI8.1; Efficiency of the algorithm for finding Biblical and Qur’anic quotations in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2), to be assessed within M18; %age of correct quotations found; min. 90%
KPI8.2; Efficiency of the algorithm for finding Biblical and Qur’anic quotations in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2), to be assessed within M18; %age of wrong quotations found; max. 5%
KPI8.3; Efficiency of the algorithm for finding Biblical and Qur’anic quotation networks in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2), to be assessed within M20; %age of correct quotation networks found; min. 85%
KPI8.4; Efficiency of the algorithm for finding Biblical and Qur’anic quotation networks in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2), to be assessed within M20; %age of wrong quotation networks found; max. 10%
KPI8.5; Efficiency of the algorithm for suggesting Biblical and Qur’anic allusions in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2), to be assessed within M22; %age of correct allusions proposed; min. 75%
KPI8.6; Efficiency of the algorithm for suggesting Biblical and Qur’anic allusions in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2), to be assessed within M22; %age of wrong allusions proposed; max. 20%
For pre-release phase;
KPI8.7; Feedback on uBIQUity deployed (based on the analysis conducted for D8.1.3), to be assessed within M16, then M22, then M30; %age of positive feedback from a sample of at least 30 scholars; min. 75%.
KPI8.8; Feedback on uBIQUity deployed (based on the analysis conducted for D8.1.3), to be assessed within M16, then M22, then M30; %age of negative feedback from a sample of at least 30 scholars; max. 10%.
For ex-post assessment;
KPI8.9; usage assessment of D8.3 (the online version of Biblical and Qur’anic texts and commentaries, powered by uBIQUity), to be assessed within 3 years after the end of the project; # of users accessing the online database
General;
KPI9.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI9.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month
For development phase;
KPI9.1; Efficacy of the EnLil prototype, to be assessed within M12, then M22, then M24; %age of correctly represented objects; min. 95%
KPI9.2; Efficacy of the EnLil prototype, to be assessed within M12, then M22, then M24; %age of incorrectly represented objects; max. 5%
KPI9.3; Efficacy of the MiRAr prototype, to be assessed within M12, then M22, then M24; %age of correctly represented spaces; min. 95%
KPI9.4; Efficacy of the MiRAr prototype, to be assessed within M12, then M22, then M24; %age of incorrectly represented spaces; max. 5%
KPI9.5; Efficacy of the ACIS prototype, to be assessed within M12, then M22, then M24; %age of correct links between archives per object; min. 80%
KPI9.6; Efficacy of the ACIS prototype, to be assessed within M12, then M22, then M24; %age of incorrect links between archives per object; max. 20%
For pre-release phase;
KPI9.7; Feedback on the TAURUS toolkit (EnLil, MiRAr and ACIS, as described in D9.4), to be assessed within M30; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI9.8; Feedback on the TAURUS toolkit (EnLil, MiRAr and ACIS, as described in D9.4), to be assessed within M30; %age of negative feedback from a sample of at least 30 scholars; min. 75%
For ex-post assessment;
KPI9.9; usage assessment of the TAURUS toolkit (EnLil, MiRAr and ACIS, as described in D9.4), to be assessed within 3 years after the end of the project; # of users accessing the online database.
General;
KPI10.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI10.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month
For development phase;
KPI10.1; Feedback on the Preliminary data model (D10.2), to be assessed within M18; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI10.2; Feedback on the Preliminary data model (D10.2), to be assessed within M18; %age of negative feedback from a sample of at least 30 scholars; max. 15%
KPI10.3; Feedback on Annotation tools and interactive interface (as described in D10.4), to be assessed within M12, then M24, then M26; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI10.4; Feedback on Annotation tools and interactive interface (as described in D10.4), to be assessed within M12, then M24, then M26; %age of negative feedback from a sample of at least 30 scholars; max. 15%
For pre-release phase;
KPI10.5; Feedback on ReTINA (as described in D10.5), to be assessed within M30; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI10.6; Feedback on ReTINA (as described in D10.5), to be assessed within M30; %age of negative feedback from a sample of at least 30 scholars; max. 15%
For ex-post assessment;
KPI10.7; usage assessment of ReTINA-powered database, to be assessed within 3 years after the end of the project; # of users accessing the online database.
The specific TNA indicators are monitored each semester by the TNA Team Leader
For each TNA call for proposal cycle;
KPI11.1; Level of scientific relevance is measured by the percentage of the scientific Peer Review Committee (PRC) that analysed the proposals; Based on the level of presence of each PRC member; min. 90%
KPI11.2; Access provision efficiency is measured by the ratio between the planned number of access users and the actual users per call; min. 90%
KPI11.3; TNA efficiency is measured by the ratio between the proposals approved by the PRC and the access slots provided by ITSERR; min. 90%
KPI11.4; Access Quality is measured by the average of the evaluations from questionnaires returned by TNA users; min. 80% of satisfied users
Deliverables quality and deadlines are monitored monthly by the Infrastructure Manager, the TNA Team Leader and the TNA WP leader
KPI11.5; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI11.6; Compliance with agreed planned activities; Number of activities that deviate from the agreed set of activities; max. 2 per month
KPI11.7; Number of Knowledge Management (KM) artefacts with an expired review date; Date of last update (based on document control information) uploaded against date of latest upload version; Zero (0) deviations
Master Data Management events are monitored monthly
KPI12.1; Number of Master Data Model release back-outs; Based on the release data available; Zero releases
KPI12.2; Data test execution coverage; Total number of executed test cases divided by the total number of test cases planned to be executed (%). Information to be gathered from the Test Management tool; 100% of planned test cases are executed
KPI12.3; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI12.4; Compliance with agreed planned activities; Number of activities that deviate from the agreed set of activities; max. 2 per month
KPI12.5; Delay in successful implementation of corrective action plan following quality audits; Number of working days of delay beyond expected implementation; zero working days
KPI12.6; # of incident reports due to quality issues with deliverables; Number of deliverables not meeting the requirements defined in the DOP/QAP/SMP/CDP; max. 1 per month
KPI12.7; Implementation of updates to Configuration Management DataBase (CMDB) arising out of change management process; Based on data available in CMDB for every Configuration Item that required to be updated; One item per month not kept up-to-date within 10 working days of change being made
KPI12.8; Number of Knowledge Management (KM) artefacts with an expired review date; Date of last update (based on document control information) uploaded against date of latest upload version; Zero (0) deviations
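Several recurring KPIs in the table above reduce to simple computations: the DTM compliance indicators (KPI4.0a through KPI12.3) count the working days elapsed between the expected due date and the effective delivery date, and the coverage indicators (KPI1.7, KPI12.2) divide executed test cases by planned ones. A minimal Python sketch of both checks, assuming a Monday-to-Friday working week with no holiday calendar; the function names and example dates are illustrative, not part of the QAP.

```python
from datetime import date, timedelta

def working_days_elapsed(due: date, delivered: date) -> int:
    """Working days (Mon-Fri) by which `delivered` exceeds `due`; 0 if on time."""
    if delivered <= due:
        return 0
    elapsed = 0
    day = due
    while day < delivered:
        day += timedelta(days=1)
        if day.weekday() < 5:  # Monday=0 ... Friday=4; weekends excluded
            elapsed += 1
    return elapsed

def coverage_pct(executed: int, planned: int) -> float:
    """Test execution coverage: executed test cases over planned ones, in %."""
    return 100.0 * executed / planned if planned else 0.0

# Illustrative check against the KPI4.0a-style threshold (max. 5 working days):
# due Friday 2025-06-06, delivered the following Friday 2025-06-13.
delay = working_days_elapsed(date(2025, 6, 6), date(2025, 6, 13))
print(delay, delay <= 5)      # 5 True
print(coverage_pct(95, 100))  # 95.0
```

A production monitor would replace the weekday check with the organisation's holiday calendar; the complementary min/max percentage pairs (e.g. KPI5.1/KPI5.2) follow directly from `coverage_pct`-style ratios over correct and incorrect cases.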
