Ray Tools Jump Synthesis Rubric
Template for capturing and structuring insights from each call.
| Field | Description |
|---|---|
| CALL DATE | YYYY-MM-DD |
| DURATION | e.g. 45 min |
| PARTICIPANTS | Names / roles |
| RAY TOOL(S) TESTED | Tool name(s) |
1 Call Summary
High-level overview of what was discussed, decisions made, and key outcomes from the call.
| Field | Description |
|---|---|
| SUMMARY (3-5 SENTENCES) | Capture the purpose of the call, what was demonstrated, who was present, and any decisions or next steps agreed upon. |
2 Ray Tool Testing Capture
For each tool tested on the call, capture direct quotes, context, the problem being addressed, and who said it. Repeat this block per tool if multiple were tested.
| Field | Description |
|---|---|
| TOOL TESTED | Tool name |
| PROBLEM / USE CASE | What problem was the tool solving? |
| CONTEXT | Workflow or scenario where the tool was used |
KEY QUOTES
- "Exact quote from the call about the tool experience..." — Speaker name, role | Timestamp
- "Exact quote from the call about the tool experience..." — Speaker name, role | Timestamp
| Field | Description |
|---|---|
| OUTCOME / VERDICT | Did it work? What happened? |
| FOLLOW-UP NEEDED? | Any bugs, requests, or iterations |
3 Automation Opportunities
Identify three automation opportunities surfaced during the call — processes, workflows, or pain points that could be addressed with tooling.
| # | Opportunity title | Description |
|---|---|---|
| 1 | Opportunity title | Description of the workflow or pain point, who mentioned it, and the potential impact of automating it. |
| 2 | Opportunity title | Description of the workflow or pain point, who mentioned it, and the potential impact of automating it. |
| 3 | Opportunity title | Description of the workflow or pain point, who mentioned it, and the potential impact of automating it. |
4 Top Opportunities Worked On
From: Opportunities tab
Pull from your opportunities tracker to highlight which existing opportunities were actively worked on or discussed during this call.
| Field | Description |
|---|---|
| OPPORTUNITIES REFERENCED | List the opportunity name/ID, current status, and any progress or blockers discussed on the call. |
5 External Feedback Captured
From: Feedback tab
Surface any customer or external stakeholder feedback that was shared during the call.
| Field | Description |
|---|---|
| FEEDBACK SOURCE | Customer name, account, or channel |
| SENTIMENT | Positive / Neutral / Negative |
| FEEDBACK DETAIL | What did the customer say? What was the context? Any action items? |
6 Tools Improved
From: Heuristics tab
Note any tools that were updated, tuned, or improved based on heuristics tracked in your heuristics tab.
| Field | Description |
|---|---|
| TOOL IMPROVEMENTS | Which tool was improved, what changed (prompt, logic, UI), and which heuristic triggered the change. |
7 Talk Time Breakdown
Summarize approximate talk time per participant. Use transcript word counts or meeting tool analytics if available.
| Participant | Talk Time |
|---|---|
| Participant A | 45% |
| Participant B | 30% |
| Participant C | 15% |
| Participant D | 10% |
Note: Talk time is approximate, derived from transcript word count or meeting analytics. Adjust participant names and percentages per call.
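If meeting-tool analytics are unavailable, the word-count approach above can be scripted. Below is a minimal sketch assuming a plain-text transcript where each line starts with the speaker's name followed by a colon (e.g. "Participant A: ..."); the transcript format and participant names are illustrative, not from any specific tool.

```python
from collections import Counter

def talk_time_shares(transcript_lines):
    """Approximate talk-time share per speaker from word counts.

    Assumes each line looks like "Speaker Name: what they said".
    Lines without a "Name:" prefix are skipped.
    Returns a dict mapping speaker -> integer percentage of total words.
    """
    words = Counter()
    for line in transcript_lines:
        if ":" not in line:
            continue
        speaker, _, utterance = line.partition(":")
        words[speaker.strip()] += len(utterance.split())
    total = sum(words.values()) or 1  # avoid division by zero on empty input
    return {speaker: round(100 * count / total) for speaker, count in words.items()}

# Hypothetical transcript excerpt for illustration
lines = [
    "Participant A: we demoed the tool and walked through the output",
    "Participant B: the summary section was the most useful part",
    "Participant A: agreed, let's iterate on the quote capture next",
]
print(talk_time_shares(lines))
```

Because rounding can make the percentages sum to slightly more or less than 100, treat the output as approximate, consistent with the note above.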
8 Synthesis Questions — User/Tester Response
Capture the response of the person using/testing the tool (typically the CEO or design lead). These questions should be answered for each tool tested.
Copy-paste ready (no line breaks):
Problems | Was anything unclear or hard to act on? | Which section gave you the most useful insight? | What's the one thing we could improve? | Does this tool save you time? | Implications / Next Steps
Per-tool response capture:
| Field | Description |
|---|---|
| TOOL TESTED | Tool name |
| TESTER | Name / role of the person using the tool |
| Problems | What problems did the tester encounter? Usability issues, confusion, broken elements, missing context? |
| Was anything unclear or hard to act on? | What labels, data, or interactions were confusing? What couldn't the tester interpret without extra explanation? |
| Which section gave you the most useful insight? | Which part of the tool output was most valuable to the tester? What data or view did they respond to most positively? |
| What's the one thing we could improve? | The tester's single highest-priority improvement request. Capture their exact framing if possible. |
| Does this tool save you time? | Did the tester validate time savings? If so, what was the before/after comparison? Capture specific numbers. |
| Implications / Next Steps | What did the tester say should happen next? Concrete actions, feature requests, or workflow changes they called out. |
Note: Repeat this block per tool if multiple tools were tested on the call. Focus on capturing the tester's direct experience and language, not team-level discussion.