
ETC Solutions – The Examiners and Observers function.



The fundamental role of the Examiner is to select the best volunteer for a job position with the support of the Observers.

There are three levels at which the Examiner and the Observers review the candidate’s metrics: session scores, question video responses, and support information.

Let’s start at the top level.

The “EMOTIONAL Fingerprint” consolidates each candidate’s automated metrics and evaluations, plus the manual scores assigned by the Examiner, to help decide which volunteer is best suited for a job position.







The “Diagnose & Evaluate” module is a working space where the Examiner analyzes the sessions recorded by the Volunteer (Candidate/Employee); it also includes the manual assessment of Climate and Values.


The Key Performance Indicators at the Question and Session levels allow the Examiner to drill down and carry out a detailed assessment of the candidate on each question.

Now let’s review in detail how the Examiner should execute this function:

In the “EMOTIONAL Fingerprint”, the Examiner reviews the candidates’ progress for a Job Position in terms of the number of sessions completed.

The main ETC Solutions metrics are the “Trust Factor,” “P/N” (positive versus negative emotions), and the “Danger Alert,” all critical elements in the evaluation process.


Metric Interpretation Guidelines:

Alert / Warning – Automatically triggered variations and correlations can indicate a reaction that should be explored further with complementary questions.


P/N – Positive to Negative Emotions. This is the ratio of the positive emotions experienced to the negative emotions registered, expressed as a percentage: the higher the percentage, the fewer negative alterations the person shows.
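As a rough illustration only (the exact formula belongs to ETC Solutions and is not published here), a P/N metric of this shape can be computed from counts of positive and negative emotion registrations; the function and parameter names below are hypothetical:

```python
def pn_score(positive_events: int, negative_events: int) -> float:
    """Hypothetical P/N metric: positive emotion registrations as a
    percentage of all registrations. A higher value means fewer negative
    alterations relative to positive emotions."""
    total = positive_events + negative_events
    if total == 0:
        return 0.0  # no emotional registrations recorded for this span
    return 100.0 * positive_events / total

# A candidate with 30 positive and 10 negative registrations scores 75.0
```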


Danger Alert – Anger dominates the rest of the emotions, which indicates that the Examiner needs to evaluate the question more deeply.


Soft vs. Hard Questions – Soft (general) questions establish a baseline of answers that calibrates the person’s emotional registers throughout the examination; Hard (specific or research) questions are measured against that baseline.


TRUST Factor – the ratio of the P/N of the Hard questions to that of the Soft ones. The lower the percentage, the greater the need for further examination.
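Under the same hypothetical reading, the TRUST Factor can be sketched as the hard-question P/N divided by the soft-question baseline P/N:

```python
def trust_factor(pn_hard: float, pn_soft: float) -> float:
    """Hypothetical TRUST Factor: hard-question P/N over the soft-question
    baseline P/N, as a percentage. Lower values suggest the candidate's
    emotional profile worsened on hard questions and warrant closer
    examination."""
    if pn_soft == 0:
        return 0.0  # no usable soft-question baseline
    return 100.0 * pn_hard / pn_soft

# A candidate whose P/N drops from 80 (soft) to 60 (hard) scores 75.0
```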


Manual Metrics – Assigned by Examiner


Good Volunteer – an evaluation the Examiner carries out on each question or value to produce the Volunteer’s overall qualification.

When evaluating a volunteer’s response, several key criteria must be considered. Here are some evaluation criteria you can use to assess the quality and effectiveness of a response:

Accuracy and Knowledge: Assess the volunteer’s understanding of the topic and their ability to provide accurate and factual information. Look for relevant and up-to-date knowledge that demonstrates a strong grasp of the subject matter.

Clarity and Organization: Evaluate how well the candidate communicates their ideas. Look for clear and concise language, a logical flow of thoughts, and well-structured paragraphs or sections. A well-organized response is easier to follow and understand.

Depth and Detail: Consider the level of depth and detail in the response. A strong response should go beyond surface-level explanations and provide insightful and comprehensive information. Look for examples, supporting evidence, and relevant details that enhance the argument or explanations.

Critical Thinking: Assess the ability to think critically and analyze the topic. Look for evidence of logical reasoning, sound judgment, and the ability to evaluate different perspectives. A strong response should demonstrate the candidate’s ability to assess the strengths and weaknesses of different arguments or approaches.

Originality and Creativity: Consider whether the person brings unique insights or perspectives to their response. Look for original ideas or creative solutions that demonstrate a fresh approach to the topic. A strong response should showcase the candidate’s ability to think innovatively and offer unique contributions.

Coherence and Cohesion: Evaluate the coherence and cohesion of the response. Look for a clear and consistent line of thought throughout the answer. Assess how well the candidate connects ideas and concepts and how effectively they transition between different points or sections.

Language: Consider the candidate’s language proficiency and grammar usage. A strong response should be delivered in a professional and polished manner.

Engagement with the Audience: Assess how well the volunteer engages. Look for a response tailored to the audience’s specific needs or expectations. A strong response should be relevant, interesting, and capable of capturing the listener’s attention.

Remember that these evaluation criteria are guidelines, and the importance of each criterion may vary depending on the context and requirements. It’s also important to consider the specific instructions or guidelines provided when assessing a volunteer’s response.


Honesty Score – a collection of elements we call “Checkpoints” that record a person’s bodily and verbal reactions during the interview. Through detailed observation, it can be detected whether they are lying.

Human perception is crucial when the Examiner evaluates the “Good Candidate” score on the five-position scroll bar, from best to worst, in the three Case sections: Incident, Success, and Resume and Background.



The examiner can evaluate trends and patterns of the Hard questions with the help of ALERTS, triggered by an automated algorithm at the question and session levels.

An ALERT or Warning is a clear indication to the Examiner that further review of the question’s video recording and related metrics is required. Additional, complementary questions are recommended for deeper research and a fairer evaluation.
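As a sketch of how such an automated trigger might work (the real algorithm and thresholds are internal to ETC Solutions; the thresholds and field names below are illustrative assumptions), an alert could fire when a question’s metrics cross a cutoff:

```python
def question_alerts(questions, pn_threshold=50.0, anger_threshold=0.5):
    """Flag questions whose hypothetical metrics cross illustrative cutoffs.

    `questions` maps a question id to a dict with:
      'pn'    - P/N percentage for that question (0..100)
      'anger' - anger's share of all registered emotions (0..1)
    Returns a list of (question_id, reason) pairs for the Examiner to review.
    """
    alerts = []
    for qid, metrics in questions.items():
        if metrics["pn"] < pn_threshold:
            alerts.append((qid, "low P/N"))
        if metrics["anger"] > anger_threshold:
            alerts.append((qid, "danger: anger dominates"))
    return alerts
```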

We recommend that this assessment be executed at the question level; the results will automatically be reflected in the “Diagnose & Evaluate” module.


Another key function for the Examiner to evaluate is the “Honesty Score,” across 10 questions, based on “checkpoints” drawn from the microexpressions of the eyes and face, verbal and body language, and the candidate’s circumstances, which together estimate the probability that the candidate is being honest.
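The exact checkpoint model is not published; as a purely illustrative sketch, a score of this shape could average per-question checkpoint ratings into a single probability-like percentage (the rating scale below is an assumption):

```python
def honesty_score(checkpoints_per_question):
    """Hypothetical Honesty Score: the average of all checkpoint ratings
    across the scored questions, as a percentage.

    `checkpoints_per_question` is a list (one entry per question) of lists
    of checkpoint ratings, each in 0..1 where 1 means the reaction was
    fully consistent with honesty."""
    ratings = [r for question in checkpoints_per_question for r in question]
    if not ratings:
        return 0.0  # nothing scored yet
    return 100.0 * sum(ratings) / len(ratings)
```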

The “Interview – Climate” section does not require an evaluation for each of the 30 questions; instead, the Examiner assesses the Climate Categories and the Values based on the key metrics.

At the end of the “Diagnose & Evaluate” session, the Examiner can capture the results of Other Tests performed on the candidate to evaluate their overall impact on the examination.


It works as a central repository for other tests, improving and supporting better decisions.

The Examiner evaluates the Candidate’s Skills, which update the “Behavior Fingerprint” module as supporting information.


The Examiner and Observers leave their comments in the notes area for each section.

If you have questions about the role of the Examiner, review the HELP – Guidelines link in your language, or contact us at





Evaluation Criteria for Volunteers’ Responses in Remote Video Interviews Based on Customized Questionnaires

Evaluating job volunteers’ responses in remote video interviews using customized questionnaires requires well-defined criteria to assess their suitability for the role. Here are evaluation criteria you can use to systematically assess candidate responses:

Relevance to the Question:

  • Does the response directly address the question asked?

  • Is the response aligned with the specific topic or competency being evaluated?

Clarity and Conciseness:

  • Is the response clear and easy to understand?

  • Does the volunteer provide concise and to-the-point answers without unnecessary elaboration?

Depth of Response:

  • To what extent does the volunteer elaborate on their experiences or examples?

  • Does the response provide sufficient detail to understand the context and actions taken?

Use of the STAR Method (Situation, Task, Action, Result):

  • Does the volunteer structure their responses using the STAR method when appropriate?

  • Can they clearly explain the situation, their role, the actions taken, and the results achieved?

Alignment with Company Values and Culture:

  • Does the response demonstrate alignment with the company’s values and culture?

  • Are there any indications of potential cultural fit or misalignment?

Problem-Solving and Critical Thinking:

  • Does the responder exhibit solid problem-solving skills in their response?

  • Are they able to think critically and provide innovative solutions when discussing challenges?

Communication Skills:

  • How effectively does the candidate communicate their thoughts and ideas?

  • Do they use appropriate language and tone in their responses?

Technical Competence (if applicable):

  • For technical roles, does the candidate showcase the required technical knowledge and skills in their responses?

  • Can they explain technical concepts clearly and confidently?

Adaptability and Learning Ability:

  • Does the candidate demonstrate adaptability and a willingness to learn from past experiences?

  • Are they open to trying new approaches or learning new skills?

Leadership and Teamwork:

  • If relevant to the role, does the volunteer exhibit leadership qualities or effective teamwork in their responses?

  • Can they provide examples of successfully collaborating with others or leading initiatives?

Results and Achievements:

  • Are the outcomes and results clearly articulated in their response?

  • Do they quantify their achievements with specific metrics or figures?

Behavioral Traits and Soft Skills:

  • Does the volunteer’s response provide insights into their soft skills, such as adaptability, resilience, empathy, or conflict resolution abilities?

  • Are there indications of strong interpersonal skills?

Industry and Role-Specific Knowledge:

  • Does the candidate demonstrate a strong understanding of industry trends, best practices, and role-specific knowledge?

  • Are they up-to-date with relevant industry developments?

Professionalism and Ethical Considerations:

  • Does the volunteer exhibit professionalism and ethical behavior in their responses?

  • Are there any red flags or concerns regarding ethical or professional conduct?

Overall Impression:

  • Based on the responses, what is your overall impression of the candidate’s suitability for the role?

  • How well do they align with the desired qualifications and competencies?

Follow-Up Questions (if necessary):

  • Are there areas in the candidate’s response that require clarification or further exploration?

  • Are follow-up questions needed to gain a more comprehensive understanding?

Collaboration and Team Fit:

  • Assess how well the volunteer’s responses align with the team’s dynamics and collaborative requirements.

  • Consider their potential contributions to team success and cohesion.

Adaptability to Remote Work (if applicable):

  • Evaluate the ability to adapt to remote work environments, including communication skills, self-motivation, and remote collaboration capabilities.

Cultural Sensitivity (if applicable):

  • Assess the candidate’s awareness of and respect for diverse cultures and perspectives, especially if the role involves global or cross-cultural interactions.

Decision-Making and Judgment:

  • Evaluate the ability to make sound decisions and exercise good judgment based on their responses to situational questions.

Use these evaluation criteria as a framework to assess each candidate’s responses consistently and fairly. Assign scores or ratings to each criterion to facilitate the decision-making process and ensure that the selected candidate aligns with the job’s requirements and company culture.
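To combine per-criterion ratings into one comparable figure, one simple approach (the criteria names and weights below are examples, not part of ETC Solutions) is a weighted average:

```python
def weighted_rubric_score(ratings, weights):
    """Combine per-criterion ratings (e.g. 1-5) into one weighted score.

    `ratings` maps criterion name -> rating; `weights` maps criterion
    name -> weight. Criteria without an explicit weight default to 1.0."""
    total_weight = sum(weights.get(c, 1.0) for c in ratings)
    if total_weight == 0:
        return 0.0
    return sum(r * weights.get(c, 1.0) for c, r in ratings.items()) / total_weight

ratings = {"relevance": 4, "clarity": 5, "depth": 3}
weights = {"relevance": 2.0}  # weight relevance double; others default to 1.0
# -> (4*2 + 5 + 3) / 4 = 4.0
```

Weighting lets the importance of each criterion vary by role, as the guidelines above recommend, while keeping scores comparable across candidates.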