Labgram #111/Valgram #131 - CCTL Evaluation Test Evidence Standards
Validators and CCTLs,
This Labgram clarifies the CCTL evaluation test evidence reporting requirements specified in NVLAP Handbooks 150 and 150-20, to ensure that all CCTLs fully document the test environment.
NVLAP Handbooks 150 and 150-20 include requirements for submitting evaluation reports to NIAP CCEVS. In particular, NVLAP Handbook 150-20 Sections 5.10.1 and 5.10.3 state the following:
“5.10.1 The laboratory shall issue evaluation reports of its work that accurately, clearly, and unambiguously present the evaluator analysis, test conditions, test setup, test and evaluation results, and all other required information.
5.10.3 Evaluation reports created for submission to the CCEVS shall meet the requirements of the CCEVS and all NIAP reporting requirements. The evaluation report shall contain sufficient information for the exact test conditions and results to be reproduced later if a reexamination or retest is required. Evaluation reports shall be submitted in the form and by the method specified by CCEVS.”
A clear description of the test network is critical to analyzing the test results and putting them in context. This Labgram addresses the "big picture" of how the CCTL tests the TOE.
For networked TOEs, all items listed below must be included in the CCTL’s Test Report.
o A network architecture diagram depicting both physical and virtualized elements within the test environment boundary. The diagram must:
   o clearly mark all devices present, including any devices that are present but do not support the testing.
   o depict the points where test evidence (e.g., packet captures, logs) is extracted.
   o depict where each device is physically located. If devices are located in separate physical locations, a description of how communications between them are protected is required.
The network diagram reflects the test network as constituted for each test; multiple diagrams may be required to convey all information clearly and accurately.
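As an illustration only (the device names, locations, and IP addresses below are hypothetical and drawn from the RFC 5737 documentation ranges), a diagram meeting these requirements might resemble:

```text
 Lab Site A (primary facility)              Lab Site B (remote; IPsec VPN link)
 +---------------------+                     +----------------------+
 |  TOE (firewall)     |--[Capture Point 1]--|  Audit Server (VM)   |
 |  192.0.2.10/24      |  (Wireshark on tap) |  198.51.100.20/24    |
 +---------------------+                     +----------------------+
           |
 +---------------------+
 |  Test Workstation   |  (present; does not support this test)
 |  192.0.2.50/24      |
 +---------------------+
```

Note that the diagram marks the non-supporting device, labels the evidence extraction point, and identifies the physical locations and the protection applied to the inter-site link.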
o For each device (physical or virtualized) in the test network:
   o Device Name
   o Test Accounts and Credentials (e.g., username, password)
   o Function Performed (e.g., audit server, authentication server, CA server)
   o IP Address(es)/Subnet Mask(s)
   o MAC Address
   o Protocols Used (e.g., TLS, IPsec)
   o Timing Source – how the device is time synchronized. Devices in the test environment must be time synchronized to support correlation of evidence (e.g., logs, packet captures).
   o OS Software/Firmware Version
   o Tools Used, including version (e.g., Wireshark, Nmap, VPN client)
The protocol versions used (e.g., the TLS version) may vary from test to test, as may the function(s) a device performs and its IP addresses. The test report must provide the correct information for each test and align with the evidence delivered as part of the check-out package.
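As an illustration only (all names, addresses, and values below are hypothetical, and the field layout is not a NIAP-prescribed format), a per-device inventory entry covering the items above might be recorded as:

```yaml
# Hypothetical example entry -- values are illustrative, not prescriptive
device_name: audit-server-01
test_accounts:
  - username: evaluator1
    password: "<recorded per lab policy>"
function: audit server (syslog over TLS)
ip_address: 198.51.100.20/24            # RFC 5737 documentation range
mac_address: 00:00:5E:00:53:24          # IANA documentation MAC range
protocols: [TLS 1.2, NTP]
timing_source: NTP, synchronized to lab NTP server 198.51.100.5
os_version: Ubuntu Server 22.04 LTS
tools:
  - Wireshark 4.2.5
```

A structured record such as this also makes it straightforward to confirm, for each test, which values changed from the previous test configuration.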
In the future, more formal test procedure descriptions and structure for individual test cases will be provided. Standardizing test evidence ensures that tests are reproducible and repeatable. Moving to standardized practices and presentation of evidence allows for automation and builds confidence in the evaluation evidence presented and the rigor of the evaluation team.
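For instance, standardized records can be machine-checked for completeness. The sketch below (not a NIAP tool; the field names are assumptions chosen to mirror the list above) verifies that a device record carries every required item:

```python
# Minimal sketch: check that each device record in the test evidence
# contains every field required by this Labgram. Field names are
# illustrative assumptions, not a NIAP-defined schema.
REQUIRED_FIELDS = {
    "device_name", "test_accounts", "function", "ip_address",
    "mac_address", "protocols", "timing_source", "os_version", "tools",
}

def missing_fields(device: dict) -> set:
    """Return the set of required fields absent from a device record."""
    return REQUIRED_FIELDS - device.keys()

# Hypothetical record that omits its timing source.
record = {
    "device_name": "audit-server-01",
    "test_accounts": [{"username": "evaluator1"}],
    "function": "audit server",
    "ip_address": "198.51.100.20/24",
    "mac_address": "00:00:5E:00:53:24",
    "protocols": ["TLS 1.2"],
    "os_version": "Ubuntu Server 22.04 LTS",
    "tools": ["Wireshark 4.2.5"],
}

print(sorted(missing_fields(record)))  # → ['timing_source']
```

A check of this kind could run automatically when the check-out package is assembled, flagging incomplete device records before submission.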
If you have any questions or concerns, please contact us at 410-854-4458 or by email at firstname.lastname@example.org.
Thank you, NIAP Staff
The information contained herein is for the exclusive use of Government and Contractor personnel with a need-to-know for NIAP CCEVS information. Such information is specifically prohibited from posting on unrestricted bulletin boards or other unlimited access applications.