We have recently made an update to the data tracking logic used in the Training module on the platform.
The update removes the ‘time played’ requirement for ‘Security Maturity’ and aligns tracked metrics such as “Accuracy”, “Time Spent”, “Challenges Played”, and “Confidence Level” with those reported in the Courses module, ensuring a more consistent view of your developers’ overall progress and the success of your application security program. If you have any questions about how this update may affect your reporting, please contact customer support.
As with many things in the world, it sometimes just comes down to the numbers. That’s why we love data and insights. First things first: you need to know who has access to the metrics and statistics areas of the Secure Code Warrior® platform.
Company Administrators have full access to the metrics for the entire company, while Team Managers only have access to their assigned teams. Developers have access to their own individual metrics and statistics, and they can also see where they stand on the training leaderboard.
Check out the video below, or keep scrolling to read more details about the kind of metrics you can use.
This article defines key terms and covers management metrics and statistics for the Training and Courses modules.
Accuracy is the ratio of correct attempts to the total number of attempts made (both correct and incorrect).
Accuracy is calculated by dividing the number of correct attempts by the total of both incorrect and correct attempts and multiplying that by 100.
Accuracy = (# correct attempts / # total attempts) * 100
For example, a Player answering everything correctly the first time will have a high Accuracy score, while a Player who answers incorrectly or 'guesses' more often will have a lower Accuracy score.
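The Accuracy formula above can be sketched in a few lines of Python (the function name and the zero-attempts fallback are illustrative, not part of the platform):

```python
def accuracy(correct_attempts: int, total_attempts: int) -> float:
    """Accuracy = (# correct attempts / # total attempts) * 100."""
    if total_attempts == 0:
        # Assumption: no attempts yet means no score to report.
        return 0.0
    return (correct_attempts / total_attempts) * 100

# A player with 9 correct answers out of 10 attempts:
print(accuracy(9, 10))  # 90.0
```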
Confidence is a ratio based on the total number of Hints available and the number of Hints used.
Confidence = (#total hints - #hints used)/#total hints
#total hints counts the Hints available for the challenges the user has completed.
For example, if a player doesn't use any Hints, they'll have a high Confidence score. Players who frequently use Hints will have a lower Confidence score.
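The Confidence formula can likewise be sketched in Python (the function name and the no-hints-available fallback are assumptions for illustration):

```python
def confidence(total_hints: int, hints_used: int) -> float:
    """Confidence = (# total hints - # hints used) / # total hints."""
    if total_hints == 0:
        # Assumption: no hints available counts as full confidence.
        return 1.0
    return (total_hints - hints_used) / total_hints

# A player who used 5 of the 20 hints available:
print(confidence(20, 5))  # 0.75
```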
How is Time measured?
Challenge Time Played is the amount of time a player has spent actively in training. Tracking starts once a challenge has begun and stops once the player submits an answer. If the player navigates away from the challenge without submitting an answer, or is inactive (i.e. no mouse movement, keyboard interaction, or touch gesture) for more than 5 minutes, the time spent on that specific challenge is discarded.
Team Managers and Company Administrators can also see the total time a player interacts with the platform across Learning Resources (Videos), Training, Assessments, and Tournaments. This is referred to as Time Spent.
Time Spent is determined by monitoring the active Playing mode every 30 seconds. If the player navigates away from the platform (closes the browser window or switches browser tabs) or is inactive for more than 10 minutes, the time spent is discarded.
How are experience Points calculated?
The possible number of Points earned for each challenge is determined by the Secure Code Warrior Security Competency Algorithm Metric, or SCAM, much like the famous Google Algorithm for search. SCAM weighs a number of factors, such as Playing Stage, Challenge Difficulty, Application Type, Hints Used, and Failed Attempts (or guesses!), to derive the Player's Accuracy, Confidence, and Points.
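To make the idea concrete, here is a purely hypothetical sketch of how such factors could combine into a score. The weights, base values, and penalties below are invented for illustration; the real SCAM weighting is not public:

```python
# Hypothetical difficulty base scores -- NOT the real SCAM values.
DIFFICULTY_POINTS = {"easy": 100, "medium": 200, "hard": 300}


def challenge_points(difficulty: str, hints_used: int,
                     failed_attempts: int) -> int:
    """Illustrative only: start from a difficulty-based base score and
    deduct a fixed penalty for each hint used and each failed attempt."""
    base = DIFFICULTY_POINTS[difficulty]
    penalty = 20 * hints_used + 10 * failed_attempts
    return max(base - penalty, 0)  # never drop below zero

# One hint and two failed guesses on a medium challenge:
print(challenge_points("medium", 1, 2))  # 160
```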
Training Metrics and Statistics
As a Company Administrator or Team Manager, select VIEW METRICS or select Training from the top menu Metrics tab as shown below.
As a Company Administrator with the Company selected, you'll be able to report on and analyze the entire company. The reports can be filtered by Team and Language:Framework for more specific views. As a Team Manager, you'll only be able to see teams you've been assigned.
The top section highlights knowledge distribution across your team and strengths/weaknesses by vulnerability category.
- Developer knowledge distribution which details the number of developers in each security knowledge maturity level. Security Maturity may be disabled in Company Configuration.
- Developer strengths and weaknesses in a spider diagram located on the right. This helps identify where to focus ongoing training (e.g. tournaments and courses).
NOTE: Security Skill Scorecard does not include points earned through Courses, Tournaments or Assessments.
To deep-dive into a specific Team's or Developer's results, simply click the Team or Developer panel. Alternatively, these can be accessed via Administration in the top menu.
Courses Metrics and Statistics
As a Company Administrator or Team Manager, select Courses from the top menu Metrics tab as shown below. Alternatively, select VIEW METRICS from the All Courses panel.
As a Company Administrator or Team Manager, select the desired team to view the Course Progress Summary and the Progress or Points Leaderboard. As a Team Manager, you'll only be able to see teams you've been assigned.
The summary can be filtered by date or Course status (i.e. Published, Unpublished, Expired, and Archived).
The Course Progress Summary displays Course Name, Description, Status, Deadline, Created By, Players Invited, Players Enrolled, Players Completed, % Completed, Language:Framework, and Average Time Spent.
The Leaderboard displays progress by team and by player, broken down by Course. Team Progress shows the number of Developers, Correct Attempts, Total Attempts, Accuracy, Confidence, and Challenge Time Spent. Player Progress also shows Language:Frameworks and videos watched.
With the desired team selected, click EXPORT CSVS and choose the desired report to export.
All of this is available as both CSV and API downloads, which can be rolled up to a company level and offer additional information on training activity across the organization.