Automatic commit: D1.2 Quality Assessment Plan (QAP)
This commit is contained in:
parent 55754958aa
commit ed75669158
@@ -0,0 +1,166 @@
ID;Title;Metric;Service Level
For incident management
KPI1.1; Incident acknowledgement time
KPI1.2; Incident intervention time
KPI1.3; Incident resolution time
For IT Software Development Life Cycle events
KPI1.4; Number of release back-outs; Based on the release data available; Zero releases
KPI1.5; Number of defects in release/change in PRD - Incidents caused by new releases; Based on the total number of new Blocking, Major, Minor defects logged in the defect management tool during the period; The number of defects in a release in PRODUCTION should not exceed the following threshold: zero Blocking, one Major, five Minor
KPI1.6; Defects not found during test execution in non-PRODUCTION; Analysis of all new Blocking, Major, Minor defects logged in the defect management tool in the PRODUCTION environment which have no record of the tests conducted in non-PRODUCTION environments; No new defects should be found in PRODUCTION
KPI1.7; Test execution coverage; Total number of executed test cases divided by the total number of test cases planned to be executed (%). Information to be gathered from the Test Management tool; 100% of planned test cases are executed
For IT Software Development Life Cycle events
KPI1.13; Implementation of updates to the Configuration Management DataBase (CMDB) arising out of the change management process; Based on data available in CMDB for every Configuration Item that is required to be updated; One item per month not kept up-to-date within 10 working days of the change being made
KPI1.14; Number of Knowledge Management (KM) artefacts with an expired review date; Date of last update (based on document control information) checked against the date of the latest uploaded version; Zero (0) deviations
The contractual duties are monitored bi-monthly by the Board;
KPI1.15; Compliance with the delivery date of the bi-monthly progress report; number of elapsed working days between the effective delivery date and the expected due date of the report; max. 1 working day
KPI1.16; Compliance with the expected content of the bi-monthly progress report; Number of reports not meeting the content requirements; Zero non-compliance
KPI1.17; Quality of the financial report (including invoicing, evidence, etc.) submitted to the Ministry; Number of reports not meeting the content requirements; Zero non-compliance
For incident management (same as WP1 incident management KPI1.1 to 1.3)
For DC services operations events
KPI2.3; System/service availability; Uptime and availability of the PRODUCTION environment and related services as reported by the monitoring systems; The minimal availability is between 95,00% and 97,00% over 28 rolling days
KPI2.4; ibidem for the TEST environment; The minimal availability is between 90,00% and 95,00% over 28 rolling days
KPI2.5; ibidem for the DEVELOPMENT environment; The minimal availability is between 92,00% and 95,00% over 28 rolling days
KPI2.6; Quantity of incidents due to capacity (CPU, memory, storage) shortages; Based on data available in the ticketing tool; Zero incidents should happen because of insufficient service or component capacity
KPI2.7; Quantity of incidents having impacted the security of the ITSERR environments; 0 per month
KPI2.8; Quantity of incidents having impacted the backup of the ITSERR environments; max. 1 per month
KPI2.9; Quantity of incidents not solved within the time frame foreseen by the service level; per month: 0 for CRITICAL, 1 for HIGH, 3 for MODERATE and 5 for LOW
KPI2.10; Quantity of system maintenance operations not performed by the DCS team (e.g. security patching); max. 1 per month
Deliverables quality, deadlines and service level performance are monitored monthly by the Infrastructure Manager and the DCS Operation Leader
KPI2.11; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI2.12; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month
KPI2.13; Compliance with agreed planned activities; Number of activities that deviate from the agreed set of activities; max. 2 per month
KPI2.14; Delay in successful implementation of the corrective action plan following quality audits; Number of working days of delay beyond expected implementation; zero working days
KPI2.15; Number of incident reports due to quality issues with deliverables; Number of deliverables not meeting the requirements defined in the DOP/QAP/SMP/CDP; max. 1 per month
KPI2.16; Service Desk reachability through communication channels during working hours; Percentage of the calls or emails to the Desk that are acknowledged; min. 90%
For DCS services (ITIL Life Cycle) events for the Configuration Management DataBase, the KPIs are the same as for WP1 KPI1.13 and 1.14
General;
KPI3.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI3.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month
CRITERION
For development phase;
KPI3.1; Feedback from CRITERION prototype testers, to be assessed within M16, then M22, then M26; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI3.2; Feedback from CRITERION prototype testers, to be assessed within M16, then M22, then M26; %age of positive feedback from a sample of at least 30 scholars; min. 75%
For ex-post assessment;
KPI3.3; Usage assessment of CRITERION, to be assessed within 3 years after the end of the project; # of users accessing the online database
GNORM
For development phase;
KPI3.4; Efficiency of the data mining algorithm developed in A3.3, to be assessed within M20; %age of correctly extracted paratextual elements; min. 85%
KPI3.5; Efficiency of the data mining algorithm developed in A3.3, to be assessed within M20; %age of incorrectly extracted paratextual elements; max. 15%
KPI3.6; Efficiency of the automatic properties annotation system, developed in A3.3, to be assessed within M24; %age of correctly assigned properties; min. 95%
KPI3.7; Efficiency of the automatic properties annotation system, developed in A3.3, to be assessed within M24; %age of incorrectly assigned properties; max. 5%
KPI3.8; Feedback from GNORM prototype testers, to be assessed within M18, then M24, then M26; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI3.9; Feedback from GNORM prototype testers, to be assessed within M18, then M24, then M26; %age of positive feedback from a sample of at least 30 scholars; min. 75%
For ex-post assessment;
KPI3.10; Usage assessment of GNORM-powered databases (D3.3), to be assessed within 3 years after the end of the project; # of users accessing the online databases
General;
KPI4.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI4.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month
For development phase;
KPI4.1; Efficiency of the word embedding extraction tool for DaMSym, to be assessed within M12; %age of correctly extracted properties; min. 90%
KPI4.2; Efficiency of the word embedding extraction tool for DaMSym, to be assessed within M12; %age of incorrectly extracted properties; max. 10%
KPI4.3; Efficiency of the machine translation tool for DaMSym, to be assessed within M18; %age of correct translations; min. 85%
KPI4.4; Efficiency of the machine translation tool for DaMSym, to be assessed within M18; %age of incorrect translations; max. 10%
KPI4.5; Efficiency of the text classification and document understanding tool for DaMSym, to be assessed within M20; %age of correct classifications; min. 85%
KPI4.6; Efficiency of the text classification and document understanding tool for DaMSym, to be assessed within M20; %age of incorrect classifications; max. 15%
For pre-release phase;
KPI4.7; Feedback on the DaMSym platform (as described in D4.1.1), to be assessed within M16, then M22, then M24; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI4.7; Feedback on the DaMSym platform (as described in D4.1.1), to be assessed within M16, then M22, then M24; %age of negative feedback from a sample of at least 30 scholars; max. 10%
For ex-post assessment;
KPI4.8; Usage assessment of D4.3 (the published database on the Creed powered by DaMSym), to be assessed within 3 years after the end of the project; # of users accessing the database
General;
KPI5.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI5.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month
For development phase;
KPI5.1; Efficacy of the algorithm for automatic text recognition (D5.1.1), to be assessed within M16; %age of correctly recognised text; min. 90%
KPI5.2; Efficacy of the algorithm for automatic text recognition (D5.1.1), to be assessed within M16; %age of incorrectly recognised text; max. 10%
KPI5.3; Efficacy of the algorithm for metadata extraction (D5.1.1), to be assessed within M16; %age of correctly extracted metadata; min. 90%
KPI5.4; Efficacy of the algorithm for metadata extraction (D5.1.1), to be assessed within M16; %age of incorrectly extracted metadata; max. 10%
KPI5.5; Efficacy of the intelligent and automated cataloguing prototype (D5.1.2), to be assessed within M20, then M24, then M28; %age of correctly catalogued entries; min. 90%
KPI5.6; Efficacy of the intelligent and automated cataloguing prototype (D5.1.2), to be assessed within M20, then M24, then M28; %age of incorrectly catalogued entries; max. 10%
For ex-post assessment;
KPI5.7; Integration with national bibliographic systems, to be assessed within 3 years of the end of the project; # of libraries using DigitalMaktaba
KPI5.8; Use by the research communities, to be assessed within 3 years of the end of the project; # of users accessing DigitalMaktaba-powered catalogues
General
KPI6.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI6.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month
For development phase;
KPI6.1; Efficacy of the algorithm for automatic metadata extraction on visual and audiovisual data (described in D6.1.1), to be assessed within M12, then M18, then M26; %age of correctly extracted metadata; min. 90%
KPI6.2; Efficacy of the algorithm for automatic metadata extraction on visual and audiovisual data (described in D6.1.1), to be assessed within M12, then M18, then M26; %age of incorrectly extracted metadata; max. 10%
KPI6.3; Efficacy of the algorithm for layout analysis to automatically extract texts and images (described in D6.1.1), to be assessed within M12, then M18, then M26; %age of correctly extracted texts and images; min. 90%
KPI6.4; Efficacy of the algorithm for layout analysis to automatically extract texts and images (described in D6.1.1), to be assessed within M12, then M18, then M26; %age of incorrectly extracted texts and images; max. 10%
For pre-release phase;
KPI6.5; Feedback on YASMINE applied to Sanctuaria (as described in D6.1.2), to be assessed within M28; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI6.6; Feedback on YASMINE applied to Sanctuaria (as described in D6.1.2), to be assessed within M28; %age of negative feedback from a sample of at least 30 scholars; max. 15%
For ex-post assessment;
KPI6.7; Usage assessment of D6.3 (the published Plorabunt and Sanctuaria databases), to be assessed within 3 years after the end of the project; # of users accessing the online databases
KPI6.8; Usage assessment of D6.4 (the published YASMINE metascraper), to be assessed within 3 years after the end of the project; # of downloads
General;
KPI7.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI7.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month
For development phase;
KPI7.1; Efficacy of the HTR/OCR tool (described in D7.1.2), to be assessed within M14; %age of correct recognition; min. 95%
KPI7.2; Efficacy of the HTR/OCR tool (described in D7.1.2), to be assessed within M14; %age of wrong recognition; max. 5%
KPI7.3; Efficacy of the algorithm for linking regesta to their sources (described in D7.1.2), to be assessed within M18; %age of correct links from regesta to the source; min. 90%
KPI7.4; Efficacy of the algorithm for linking regesta to their sources (described in D7.1.2), to be assessed within M18; %age of missing links from regesta to the source; max. 20%
KPI7.5; Efficacy of the algorithm for automatically producing regesta from sources (described in D7.1.2), to be assessed within M24; %age of correct summarisation of sources into meaningful regesta; min. 80%
KPI7.6; Efficacy of the algorithm for automatically producing regesta from sources (described in D7.1.2), to be assessed within M24; %age of incorrect summarisation of sources, production of meaningless or incorrect regesta; max. 20%
For pre-release phase;
KPI7.7; Feedback on REVER (user interface + algorithms, described in D7.1.3), to be assessed within M14, then M22, then M28; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI7.8; Feedback on REVER (user interface + algorithms, described in D7.1.3), to be assessed within M14, then M22, then M28; %age of negative feedback from a sample of at least 30 scholars; max. 10%
For ex-post assessment;
KPI7.9; Usage assessment of D7.3 (the online version of the Regesta Pontificum Romanorum), to be assessed within 3 years after the end of the project; # of users accessing the online database
General;
KPI8.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI8.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month
For development phase;
KPI8.1; Efficiency of the algorithm for finding Biblical and Qur'anic quotations in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2), to be assessed within M18; %age of correct quotations found; min. 90%
KPI8.2; Efficiency of the algorithm for finding Biblical and Qur'anic quotations in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2), to be assessed within M18; %age of wrong quotations found; max. 5%
KPI8.3; Efficiency of the algorithm for finding Biblical and Qur'anic quotation networks in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2), to be assessed within M20; %age of correct quotation networks found; min. 85%
KPI8.4; Efficiency of the algorithm for finding Biblical and Qur'anic quotation networks in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2), to be assessed within M20; %age of wrong quotation networks found; max. 10%
KPI8.5; Efficiency of the algorithm for suggesting Biblical and Qur'anic allusions in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2), to be assessed within M22; %age of correct allusions proposed; min. 75%
KPI8.6; Efficiency of the algorithm for suggesting Biblical and Qur'anic allusions in the respective corpora of commentaries (used in uBIQUity as described in D8.1.2), to be assessed within M22; %age of wrong allusions proposed; max. 20%
For pre-release phase;
KPI8.7; Feedback on uBIQUity as deployed (based on the analysis conducted for D8.1.3), to be assessed within M16, then M22, then M30; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI8.8; Feedback on uBIQUity as deployed (based on the analysis conducted for D8.1.3), to be assessed within M16, then M22, then M30; %age of negative feedback from a sample of at least 30 scholars; max. 10%
For ex-post assessment;
KPI8.9; Usage assessment of D8.3 (the online version of Biblical and Qur'anic texts and commentaries, powered by uBIQUity), to be assessed within 3 years after the end of the project; # of users accessing the online database
General;
KPI9.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI9.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month
For development phase;
KPI9.1; Efficacy of the EnLil prototype, to be assessed within M12, then M22, then M24; %age of correctly represented objects; min. 95%
KPI9.2; Efficacy of the EnLil prototype, to be assessed within M12, then M22, then M24; %age of incorrectly represented objects; max. 5%
KPI9.3; Efficacy of the MiRAr prototype, to be assessed within M12, then M22, then M24; %age of correctly represented spaces; min. 95%
KPI9.4; Efficacy of the MiRAr prototype, to be assessed within M12, then M22, then M24; %age of incorrectly represented spaces; max. 5%
KPI9.5; Efficacy of the ACIS prototype, to be assessed within M12, then M22, then M24; %age of correct links between archives per object; min. 80%
KPI9.6; Efficacy of the ACIS prototype, to be assessed within M12, then M22, then M24; %age of incorrect links between archives per object; max. 20%
For pre-release phase;
KPI9.7; Feedback on the TAURUS toolkit (EnLil, MiRAr and ACIS, as described in D9.4), to be assessed within M30; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI9.8; Feedback on the TAURUS toolkit (EnLil, MiRAr and ACIS, as described in D9.4), to be assessed within M30; %age of negative feedback from a sample of at least 30 scholars; min. 75%
For ex-post assessment;
KPI9.9; Usage assessment of the TAURUS toolkit (EnLil, MiRAr and ACIS, as described in D9.4), to be assessed within 3 years after the end of the project; # of users accessing the online database
General;
KPI10.0a; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI10.0b; Compliance with the expected content of the deliverables as agreed in the DTM; Number of reports not meeting the content requirements; max. 1 non-compliance/month
For development phase;
KPI10.1; Feedback on the Preliminary data model (D10.2), to be assessed within M18; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI10.2; Feedback on the Preliminary data model (D10.2), to be assessed within M18; %age of negative feedback from a sample of at least 30 scholars; max. 15%
KPI10.3; Feedback on the Annotation tools and interactive interface (as described in D10.4), to be assessed within M12, then M24, then M26; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI10.4; Feedback on the Annotation tools and interactive interface (as described in D10.4), to be assessed within M12, then M24, then M26; %age of negative feedback from a sample of at least 30 scholars; max. 15%
For pre-release phase;
KPI10.5; Feedback on ReTINA (as described in D10.5), to be assessed within M30; %age of positive feedback from a sample of at least 30 scholars; min. 75%
KPI10.6; Feedback on ReTINA (as described in D10.5), to be assessed within M30; %age of negative feedback from a sample of at least 30 scholars; max. 15%
For ex-post assessment;
KPI10.7; Usage assessment of the ReTINA-powered database, to be assessed within 3 years after the end of the project; # of users accessing the online database
The specific TNA indicators are monitored each semester by the TNA Team Leader
For each TNA call for proposal cycle
KPI11.1; Level of scientific relevance, measured by the percentage of the scientific Peer Review Committee (PRC) members that analysed the proposals; Based on the level of presence of each PRC member; min. 90%
KPI11.2; Access provision efficiency, measured by the ratio between the planned number of access users and the actual users per call; min. 90%
KPI11.3; TNA efficiency, measured by the ratio between the proposals approved by the PRC and the access slots provided by ITSERR; min. 90%
KPI11.4; Access quality, measured by the average of the evaluations from questionnaires returned by TNA users; min. 80% of satisfied users
Deliverables quality and deadlines are monitored monthly by the Infrastructure Manager, the TNA Team Leader and the TNA WP leader
KPI11.5; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI11.6; Compliance with agreed planned activities; Number of activities that deviate from the agreed set of activities; max. 2 per month
KPI11.7; Number of Knowledge Management (KM) artefacts with an expired review date; Date of last update (based on document control information) checked against the date of the latest uploaded version; Zero (0) deviations
Master Data Management events are monitored monthly
KPI12.1; Number of Master Data Model release back-outs; Based on the release data available; Zero releases
KPI12.2; Data test execution coverage; Total number of executed test cases divided by the total number of test cases planned to be executed (%). Information to be gathered from the Test Management tool; 100% of planned test cases are executed
KPI12.3; Compliance with delivery dates agreed in the Deliverable Tracking Matrix (DTM); number of elapsed working days between the effective delivery date and the expected due date of the deliverable; max. 5 working days
KPI12.4; Compliance with agreed planned activities; Number of activities that deviate from the agreed set of activities; max. 2 per month
KPI12.5; Delay in successful implementation of the corrective action plan following quality audits; Number of working days of delay beyond expected implementation; zero working days
KPI12.6; Number of incident reports due to quality issues with deliverables; Number of deliverables not meeting the requirements defined in the DOP/QAP/SMP/CDP; max. 1 per month
KPI12.7; Implementation of updates to the Configuration Management DataBase (CMDB) arising out of the change management process; Based on data available in CMDB for every Configuration Item that is required to be updated; One item per month not kept up-to-date within 10 working days of the change being made
KPI12.8; Number of Knowledge Management (KM) artefacts with an expired review date; Date of last update (based on document control information) checked against the date of the latest uploaded version; Zero (0) deviations
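Several of the service levels above are simple ratios: KPI1.7/KPI12.2 (test execution coverage) and KPI2.3-2.5 (availability over 28 rolling days). A minimal sketch of how such figures could be computed from exported tool data follows; the record shapes, field names and functions are illustrative assumptions and are not part of the QAP or of any ITSERR tooling.

from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative record shapes; field names are assumptions, not taken from the QAP tools.
@dataclass
class TestRun:
    case_id: str
    executed: bool  # True if the planned test case was actually run in the period

@dataclass
class UptimeSample:
    timestamp: datetime
    available: bool  # True if the monitored environment responded at this sample

def test_execution_coverage(runs: list[TestRun]) -> float:
    """KPI1.7 / KPI12.2: executed test cases over planned test cases, as a percentage."""
    planned = len(runs)
    executed = sum(1 for r in runs if r.executed)
    return 100.0 * executed / planned if planned else 0.0

def rolling_availability(samples: list[UptimeSample], now: datetime, days: int = 28) -> float:
    """KPI2.3-2.5: share of 'available' monitoring samples over a rolling window (28 days)."""
    window_start = now - timedelta(days=days)
    recent = [s for s in samples if s.timestamp >= window_start]
    up = sum(1 for s in recent if s.available)
    return 100.0 * up / len(recent) if recent else 0.0

# Example: 2 of 3 planned cases executed -> 66.7% coverage, below the 100% service level.
runs = [TestRun("TC-1", True), TestRun("TC-2", True), TestRun("TC-3", False)]
print(f"coverage = {test_execution_coverage(runs):.1f}%")

The same counting pattern would extend to the TNA ratios (KPI11.2, KPI11.3): count the relevant events per call cycle and divide.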