The Big Picture
The Big Picture report shows an overview analysis of your database. It has four sections:
- Summary statements and statistics,
- Distribution of incidents by test phase,
- Distribution of incidents by lab function, and
- A time trend.

The exact content varies depending on what fields appear on your input form.
Summary Statements
Here is an example, with a description of the input fields each statement is based on. In the Source column, fields are identified by the names used in the Example database -- these names can be changed by the user. Physical field names are shown in parentheses.
Except where noted, the phrase does not appear on the report if the applicable field does not appear on the input form.
| Statement | Source |
| --- | --- |
| NOTE: By default, this report excludes Administrative incidents, and incidents that have not yet been classified to a test phase. | Classification of the problem by test phase (ProblemClass). The report only includes incidents classified as Pre-Analytic, Analytic, or Post-Analytic, unless you check the box in Preferences to include all classified incidents. |
| For 1,233 incidents reported between 06 Dec 2002 and 03 Apr 2005: | Date reported (ProblemDate). By default, the report covers the latest 90 days. You can change this with the File/Preferences menu command. |
| 1,550 samples were affected, an average of 1.26 samples per incident | Sum of Number of samples affected (NumSmpls). If this field does not appear on your input form, the number of samples affected is the same as the number of incident reports. |
| Error Rate: Number of errors per 1000 samples = 1.11, assuming a typical volume of 50,000 samples/month x 27.91 months = 1,395,616 samples | The typical monthly sample volume is set with File/Preferences. If you don't set it, the error rate is omitted from the report. |
| This error rate is equivalent to a Sigma Metric of 4.6σ | To omit this item, uncheck Show Sigma Metric in File/Preferences. The sigma metric is the equivalent number of SDs from a Gaussian distribution. Example: an error rate of 5% = 2σ; an error rate of 1% = 3σ. |
| Patient Safety: Patient safety impact was evaluated for 984 (80%) of incidents. | What was the actual effect on the patient? (ActualPtOutcome) - number of incidents for which a value was assigned. In other words, <Unassigned> means patient safety impact was not evaluated. |
| For these incidents, the average severity of patient outcome was 1.31 (1=No Effect, 2=Minor, 3=Severe) | Total severity divided by the number of incidents evaluated. Example: Incident 1 has severity 1, Incident 2 has severity 2, Incident 3 has severity 3. Average Severity = (1+2+3)/3 = 2. |
| 1,149 (93%) of incidents were potentially serious adverse events (near misses) | Yes response to Was this a potentially serious adverse event? (NearMiss) |
| 489 (40%) caused specimen to be redrawn | Yes response to Was the specimen redrawn or recollected? (Redrawn) |
| 1,052 (85%) caused a delay in reporting results | Yes response to Was there a delay in reporting test results? (ResultsDelay) |
| 360 (29%) caused incorrect results to be reported | Yes response to Were incorrect results reported? (ResultsIncorrect) |
| Lab was responsible for 693 (56%) of incidents | Yes response to Was the lab responsible for the error? (LabResponsible) |
| 500 (72%) of incidents for which lab was responsible were preventable | Yes response to Was this problem preventable? (Preventable) |
| 211 (30%) of incidents for which lab was responsible were cognitive (mistakes that involve misinterpretation of results, or faulty decision-making by the technologist) | Was this problem cognitive or non-cognitive? (CogNonCog) |
| The top 3 problems were: | Classification of problem by test phase (ProblemClass) |
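The error-rate and sigma-metric arithmetic from the statements above can be sketched as follows. This is a minimal Python illustration, not the program's actual code; the 1.5σ long-term shift is the common Six Sigma convention and is an assumption here, though it reproduces the 4.6σ figure shown in the example report.

```python
from statistics import NormalDist

# Figures from the example report above.
samples_affected = 1550
monthly_volume = 50_000   # typical volume, set in File/Preferences
months_covered = 27.91    # span of the reporting period

# Error rate: errors per 1000 samples over the period.
total_samples = monthly_volume * months_covered
errors_per_1000 = samples_affected / total_samples * 1000
print(f"Error rate: {errors_per_1000:.2f} per 1000 samples")  # ~1.11

# Sigma metric: equivalent number of SDs on a Gaussian distribution,
# with the conventional 1.5-sigma shift (assumed convention).
error_rate = errors_per_1000 / 1000
sigma = NormalDist().inv_cdf(1 - error_rate) + 1.5
print(f"Sigma metric: {sigma:.1f}")  # ~4.6
```

Note that the rate is computed from samples affected, not from the incident count, which is why 1,550 affected samples over roughly 1.4 million total samples gives 1.11 errors per 1000.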
Distribution by Test Phase and Lab Function
Click on a line item to browse the associated incidents.
Time Trend
By default, the chart covers all data in the database. Use File/Preferences to change the time span.