Internet Security Auditors — Madrid, September 23, 2025
We are especially pleased to share that, despite not being a CSIRT or Blue Team, and in our first participation in the International CyberEx 2025 cyber-exercise, organized by the OAS (Organization of American States) and INCIBE (National Cybersecurity Institute), we reached 18th place in an especially competitive edition: 84 teams and 336 specialists from 17 countries, including almost 60 teams from the OAS sphere, more than 25 teams from Spain and other countries, and the special participation of 5 teams and 19 experts from the Spanish national selection of young cybersecurity talents (ECSC). The exercise was held as a team "jeopardy"-style CTF over 8 hours.
Competing against teams with CSIRT DNA and finishing in the Top 20 on our first participation confirms that our offensive processes and skills are up to the task. We are already applying that learning to how we design and execute our pentests.
Mario Valiente Catalán, Captain of the Internet Security Auditors team
The organization only publishes the Top 10; the rest of the positions remain anonymous, so we are sharing our result for transparency with the community.
About the International CyberEx 2025
The International CyberEx seeks to strengthen response capabilities and cooperation among OAS countries and teams invited by INCIBE. The 2025 edition was held in English and Spanish, with teams of 3–4 people and a "jeopardy"-style CTF environment (challenge categories with different scoring).
Our experience in the CTF: actionable learnings
In the CTF we achieved frictionless coordination: a captain set priorities while members worked in parallel on research, exploitation, and documentation, each with clear tasks. That pace strengthens our Red Team.
We worked from testable hypotheses with a constant focus on the objective, covering complementary disciplines (reverse engineering, cryptography, forensics, and steganography) to understand the end-to-end attack chain, from enumeration to obtaining the flag.
As an experienced Red Team, this approach is not new to us; we have been working for years with a “purple team” mindset that integrates the defensive view into our exercises.
Participation in the CTF acted as a demanding testing ground to validate and reinforce already consolidated practices: we refined test cases and acceptance criteria that we already use in pentesting, and confirmed the robustness of our detection hypotheses.
What does this mean for our clients?
In the CTF we validated real capability under pressure. Before each challenge we set a roadmap and worked in 45-minute cycles with decision points, which prevents dispersion and maximizes progress. That discipline is what we transfer to our pentesting projects through a repeatable methodology: we start from enumeration and an inventory of surfaces and services, formulate exploitation hypotheses, execute controlled exploitation in agreed windows, and convert flags into reproducible evidence (proofs, artifacts, hashes) ready for audit and retest.
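As a rough illustration of what "reproducible evidence" can look like in practice, here is a minimal sketch of an audit-ready evidence record; the function names, fields, and example values are hypothetical, not our actual tooling:

```python
import hashlib
import json
from datetime import datetime, timezone

def sha256_of(data: bytes) -> str:
    """Return the SHA-256 hex digest of a byte string."""
    return hashlib.sha256(data).hexdigest()

def evidence_record(finding_id: str, artifact_name: str, artifact_bytes: bytes) -> dict:
    """Build an evidence entry: finding, artifact name, hash, capture time."""
    return {
        "finding_id": finding_id,
        "artifact": artifact_name,
        "sha256": sha256_of(artifact_bytes),
        "captured_at": datetime.now(timezone.utc).isoformat(),
    }

# Example: register a captured HTTP response as evidence for a finding
record = evidence_record("F-2025-001", "login_response.txt", b"HTTP/1.1 200 OK ...")
print(json.dumps(record, indent=2))
```

Hashing each artifact at capture time lets a client verify during audit or retest that the evidence has not been altered since the engagement.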
In addition, the competition forced us to prioritize by impact in real time; that practice becomes remediation backlogs oriented toward risk (criticality, exploitability, and exposure), with owners and target dates, accelerating the closure of critical findings and reducing rework between security, development, and operations.
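A risk-oriented backlog of this kind can be sketched with a simple scoring scheme; the findings, factor scale, and multiplicative formula below are illustrative assumptions, not a published methodology:

```python
# Each factor is scored 1-5 (illustrative scale); higher total risk is fixed first.
findings = [
    {"id": "F-01", "title": "SQL injection on login", "criticality": 5, "exploitability": 4, "exposure": 5},
    {"id": "F-02", "title": "Verbose error pages",    "criticality": 2, "exploitability": 3, "exposure": 4},
    {"id": "F-03", "title": "Weak internal TLS",      "criticality": 3, "exploitability": 2, "exposure": 1},
]

def risk_score(finding: dict) -> int:
    """Combine criticality, exploitability, and exposure into one sortable score."""
    return finding["criticality"] * finding["exploitability"] * finding["exposure"]

# Remediation backlog ordered by risk, ready to assign owners and target dates
backlog = sorted(findings, key=risk_score, reverse=True)
for f in backlog:
    print(f["id"], risk_score(f), f["title"])
```

The same ordering logic works whatever scoring model a client prefers (CVSS, internal scales); the point is that the backlog is sorted by risk, not by discovery order.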
What does your organization gain from all this?
We reached the Top 20 in our first participation, without being a CSIRT or Blue Team, because we approached the CTF with a defensive mindset. We apply that approach to our clients so that the exercise not only finds flaws but also improves detection, response, and compliance.
Red Team with a Blue Team mindset
▪️Detection hypotheses before attacking: for each planned TTP (MITRE ATT&CK) we define what the SOC should see, where (SIEM/EDR/Firewall/IdP), and with what threshold. This way the test validates both exploitation and observability.
▪️Instrumentation and traceability
▪️End‑to‑end validation: we measure the full cycle, detection → containment → eradication → recovery, and recommend actions per phase.
▪️Transferable learning: each technique tested generates runbooks and lessons learned for incident response and hardening, preventing the finding from recurring in production.
Pentesting with business focus
▪️We plan by exposed surfaces (web/API/mobile/network/AD/cloud/OT), prioritizing what impacts revenue, continuity, or compliance. We execute in controlled windows and with measurable impact (residual risk, criticality, remediation effort), without affecting production.
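The "detection hypotheses before attacking" idea above can be captured as structured data so that each planned TTP carries its expected telemetry and alert threshold. The technique IDs follow MITRE ATT&CK, but the record layout and example thresholds are a hypothetical sketch, not a product schema:

```python
from dataclasses import dataclass

@dataclass
class DetectionHypothesis:
    ttp_id: str           # MITRE ATT&CK technique, e.g. T1110 (Brute Force)
    expected_signal: str  # what the SOC should see
    telemetry: str        # where: SIEM, EDR, firewall, IdP...
    threshold: str        # alerting condition to validate

# Example hypotheses for two planned techniques (illustrative values)
hypotheses = [
    DetectionHypothesis(
        ttp_id="T1110",
        expected_signal="burst of failed logins for one account",
        telemetry="IdP / SIEM",
        threshold=">= 10 failures in 5 minutes",
    ),
    DetectionHypothesis(
        ttp_id="T1059.001",
        expected_signal="PowerShell spawned with an encoded command",
        telemetry="EDR",
        threshold="any occurrence on servers",
    ),
]

for h in hypotheses:
    print(f"{h.ttp_id}: expect '{h.expected_signal}' in {h.telemetry} ({h.threshold})")
```

During the exercise, each hypothesis is either confirmed (the alert fired) or becomes an observability gap to close, which is what makes the test validate exploitation and detection at the same time.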
Acknowledgments
Thanks to the OAS and INCIBE for the initiative, organization, and technical support. The experience has been excellent and enriching for our entire team.