Reynaldo Jarro


Hey, I'm a 2024 Computer Science/Business Administration graduate from the University of Southern California. I'm passionate about Software Engineering, Cybersecurity, and DevOps/SRE, and I'm working toward certifications such as Sec+, AWS Security, and TCM Security's PNPT.

I have used a variety of programming languages, frameworks, and tools over the years to complete projects. Currently, I am working on two: Kalao, an LLM chatbot that uses RAG over nmap scan results and API calls to NIST and SearchSploit (for Exploit-DB) to identify exploitable ports or systems on a network; and a website to track card collection progress for Chaotic, a TCG with a very niche community. For the latter, I am wrapping up the public API for card images and information.

Currently Working On

Here are some of the things I am working on at the moment:

● Learning Go, Ansible, Terraform, and becoming better at managing Kubernetes

● Contributing to Open Source Projects

● Earning cybersecurity certifications such as Sec+, PNPT, and AWS Security

● Working on my own home-lab for networking, pentesting, and infrastructure practice

● Working on both Kalao and the Card Collection Tracker

Work Experience

Amazon - Software Engineer Intern

(Seattle, WA May 2023 - August 2023)

● Developed several design documents covering high- and low-level implementation details; managed each implementation through its full lifecycle, including planning, development, and testing

● Designed and implemented a scheduled Data Consistency job that checks for consistency across four data sources, one of them a Kafka event bus, using an extensible design to ensure data integrity. When mismatches are discovered, the system creates a detailed report outlining the inconsistencies, sends a copy via SNS, and pages the on-call engineer

● Responsible for identifying all edge cases and defining the process by which each is handled

● The Data Consistency job will be used over the next year to migrate most of the data in PubTech to newer, more definitive infrastructure

● Tech Stack: AWS (S3, CDK, CloudFormation, EventBridge, DynamoDB, ECS, SQS, IAM, and Lambda), Java, Dagger, JUnit, TypeScript, Apache Kafka
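The core idea of the Data Consistency job above can be sketched in a few lines. This is a hypothetical, simplified version: the source names and record shapes are invented for illustration and are not the actual Amazon system.

```python
# Hypothetical sketch of a consistency check across several data sources.
# Each source maps record IDs to record dicts; any disagreement or missing
# record is written into a human-readable report, mirroring the job's
# report-then-alert flow (the SNS/paging step is omitted here).

def find_mismatches(sources: dict[str, dict[str, dict]]) -> list[str]:
    """Compare records keyed by ID across sources; report inconsistencies."""
    report = []
    all_ids = set()
    for records in sources.values():
        all_ids.update(records)
    for record_id in sorted(all_ids):
        # Collect each source's view of this record (None if absent).
        views = {name: recs.get(record_id) for name, recs in sources.items()}
        present = {n: v for n, v in views.items() if v is not None}
        missing = [n for n, v in views.items() if v is None]
        if missing:
            report.append(f"{record_id}: missing from {', '.join(missing)}")
        if len({tuple(sorted(v.items())) for v in present.values()}) > 1:
            report.append(f"{record_id}: values disagree across {', '.join(present)}")
    return report
```

An extensible design falls out naturally here: adding a fifth source is just another entry in the `sources` dict.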

Projects - Cybersecurity

Kalao

● Developing an open-source, LLM-powered network vulnerability analysis tool for external testing. It runs nmap and tcpdump, parses the results into XML files, and feeds those to locally running LLMs; it also queries the NIST CVE API and SearchSploit for CVE and exploit checks

● Users can interact with a chatbot and ask about the overall results, possible targets, open ports, version & exploits

● Tech Stack: Python, Ollama, LlamaIndex, nmap, tcpdump, TCP/IP, Bash, SearchSploit
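The parsing step Kalao relies on can be sketched with the standard library alone: pull open ports and service versions out of an `nmap -oX` report so they can be handed to a local LLM as context. The XML below is a hand-written sample in nmap's general shape, not real scan output, and this is a simplified sketch rather than Kalao's actual parser.

```python
# Parse a (sample) nmap XML report and extract open ports with versions.
import xml.etree.ElementTree as ET

SAMPLE = """\
<nmaprun>
  <host>
    <address addr="192.0.2.10" addrtype="ipv4"/>
    <ports>
      <port protocol="tcp" portid="22">
        <state state="open"/>
        <service name="ssh" product="OpenSSH" version="8.9"/>
      </port>
      <port protocol="tcp" portid="80">
        <state state="closed"/>
        <service name="http"/>
      </port>
    </ports>
  </host>
</nmaprun>
"""

def open_ports(xml_text: str) -> list[dict]:
    """Return one dict per open port: host, port, service, version."""
    results = []
    root = ET.fromstring(xml_text)
    for host in root.iter("host"):
        addr = host.find("address").get("addr")
        for port in host.iter("port"):
            if port.find("state").get("state") != "open":
                continue  # only open ports are interesting for exploit checks
            svc = port.find("service")
            results.append({
                "host": addr,
                "port": int(port.get("portid")),
                "service": svc.get("name") if svc is not None else None,
                "version": svc.get("version") if svc is not None else None,
            })
    return results
```

The extracted service/version pairs are exactly what a NIST CVE API or SearchSploit lookup would key on.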

Bug Bounty

● Testing websites that have appropriate policies for XSS, IDORs, information disclosure, and many other security issues via Burp Suite

● As part of the community at huntr, the first AI/ML bug bounty platform, I test ML libraries, chatbots, and frameworks for security issues using a combination of tools such as Burp Suite and MLflow

● Tech Stack/Tools and Skills: SQL injection, Burp Suite, IDORs, XSS, MLflow, SSRF, web pentesting

CTFs

During my time at USC I participated in two CTF activities, once as Red Team Captain and once on both the Blue and Red Teams.

● Red Team Captain: In my first CTF, each group was split into a blue team and a red team. The Blue Team built a rudimentary bank website where users could create accounts and send money to others, while the Red Team was responsible for hacking the other teams' banks. As Red Team Captain, I taught several teammates what SQL injection is and how to write payloads. I also handled network recon and vulnerability identification, and at the very start of the activity I noticed that the SSH port was open. After connecting, I ran a couple of customized scripts around nmap and tcpdump, and the results showed there was no encryption in transit: account-creation requests carried usernames and passwords in plain text. To exploit this, I wrote several Bash and Python scripts that worked together to keep reading users' HTTP requests, convert the pcap results to CSV files, and use those files to access the captured accounts and send their money to mine, increasing my team's points.

● Red and Blue Team: In my second CTF, each team designed a hashing protocol to secure password information, which was then shared with the other groups so they could try to crack it for points. I helped my team create and apply our hashing protocol to all of our passwords, and once we received the other teams' hashed passwords I led the development of scripts for tools such as John the Ripper and for dictionary attacks.

● Tech and Skills: nmap, tcpdump, Wireshark, Bash, Python, OpenSSH, SQL injection, Kali Linux, John the Ripper
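The dictionary-attack idea from the second CTF reduces to a short loop: hash each candidate word with the same algorithm the defenders used and compare against the leaked digests. This is a toy sketch, not the actual scripts; SHA-256 and the wordlist are assumptions for illustration.

```python
# Toy dictionary attack: recover plaintexts whose hash appears in `digests`.
import hashlib

def dictionary_attack(digests: set[str], wordlist: list[str]) -> dict[str, str]:
    """Map each cracked digest back to its plaintext (SHA-256 assumed)."""
    cracked = {}
    for word in wordlist:
        digest = hashlib.sha256(word.encode()).hexdigest()
        if digest in digests:
            cracked[digest] = word
    return cracked
```

Real tools like John the Ripper add rule-based mutations (case changes, suffixes) on top of this same hash-and-compare core.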

Pentesting Simulation

● During my Ethical Hacking class, we were split into groups of two and tasked with breaking into four HackTheBox boxes. Before the pentesting began, we had to deliver a plan covering the tools, tech, and overall strategies we intended to use; once it was approved, we could start.

● On most boxes we used nmap to scan for ports, and most had at least one vulnerable port that could be exploited. We usually ran the exploits and payloads through Metasploit, and once we had access to a machine we would attempt privilege escalation. The boxes were a mix of Linux and Windows.

● Tech and Skills: Metasploit, Windows Exploitation, Linux Exploitation, Privilege Escalation, nmap, Meterpreter, Python
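The first recon step above, scanning for open ports, can be approximated with nothing but sockets: attempt a TCP connection to each port, roughly what nmap's connect scan does. The host and ports here are placeholders; only ever scan machines you are authorized to test.

```python
# Minimal TCP "connect scan": True means the port accepted a connection.
import socket

def check_ports(host: str, ports: list[int], timeout: float = 0.5) -> dict[int, bool]:
    """Return {port: True if a TCP connection to host:port succeeded}."""
    status = {}
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising.
            status[port] = sock.connect_ex((host, port)) == 0
    return status
```

nmap layers service/version detection and OS fingerprinting on top of this basic reachability check.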

Projects - Software Engineering/DevOps

Card Collection Tracker

● Designed the project end-to-end; currently working on wrapping up the REST API

● Developing a website for a small trading card game community to track collection and game progression; utilized BeautifulSoup with Python to scrape each card's image from the web. The site will be hosted on DreamHost, data is securely stored in RDS and Cloudinary, and registration will be handled by AWS Cognito

● Tech Stack: AWS (RDS and Cognito), Python, BeautifulSoup, Express, Node.js, React, PostgreSQL, REST API
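The scraping step above boils down to extracting image URLs from a gallery page. The project uses BeautifulSoup, but a standard-library `html.parser` version keeps this sketch self-contained; the HTML snippet and the `class="card"` convention are made-up stand-ins for the real card gallery.

```python
# Stdlib-only sketch of pulling card image URLs out of a gallery page.
from html.parser import HTMLParser

class CardImageExtractor(HTMLParser):
    """Collect the src of every <img> tagged with class="card"."""
    def __init__(self):
        super().__init__()
        self.images: list[str] = []

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if tag == "img" and attr.get("class") == "card" and "src" in attr:
            self.images.append(attr["src"])

SAMPLE_PAGE = """
<div class="gallery">
  <img class="card" src="/cards/maxxor.png">
  <img class="logo" src="/logo.png">
  <img class="card" src="/cards/chaor.png">
</div>
"""

parser = CardImageExtractor()
parser.feed(SAMPLE_PAGE)
```

BeautifulSoup expresses the same filter in one line (`soup.find_all("img", class_="card")`), which is why it is the better fit for the real tracker.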

Automated CRUD AWS

● Developed, with a group, a website for researchers that automates the creation of customizable CRUDs

● Designed the AWS architecture that automates the pipeline and takes in each user's CRUD design. Used DynamoDB and Lambda to ingest the data, which was then transferred to S3 for better storage of large volumes. The CRUD itself was instantiated via Serverless

● Tech Stack: AWS (Lambda, S3, DynamoDB, and IAM), Node.js, Serverless, Java

The Movie Database-esque Website

● Led the development of a movie database website (similar to TMDB) leveraging an API that included endpoints related to searches, synopsis, reviews, and general information about movies

● Implemented the backend using Spring Boot and Java. The backend was backed by a database that stored registered accounts; it parsed the information received from the TMDB API and served it to the front-end team through the API we created with Postmate

● Tech Stack: Java, MySQL, Postmate, Spring Boot, React, HTML/CSS, Docker

NFL Wordle

● Led the development of a Wordle-inspired website centered on guessing NFL players, with a multiplayer option to play against other users

● Responsible for the design and implementation of most of the front end using Bootstrap, CSS, JavaScript, HTML, and Figma. A lot of thought went into the design to ensure users had a good experience interacting with the game

● Collaborated with the backend team to parse the data returned by the API responsible for delivering NFL player data, and helped with the MySQL database architecture

● Tech Stack: HTML/CSS, Bootstrap, JavaScript, Figma, MySQL
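The guess-feedback logic at the heart of an attribute-based Wordle clone is worth sketching: compare each field of the guessed player against the hidden answer and grade it. This is a hypothetical sketch, not the site's actual (JavaScript) implementation, and the player fields are invented for illustration.

```python
# Grade a guess against the hidden player, Wordle-style, per attribute.

def grade_guess(guess: dict, answer: dict) -> dict[str, str]:
    """Return per-attribute feedback: 'match', 'close' (a number
    within 2, e.g. jersey number), or 'miss'."""
    feedback = {}
    for field, value in guess.items():
        target = answer[field]
        if value == target:
            feedback[field] = "match"
        elif isinstance(value, int) and abs(value - target) <= 2:
            feedback[field] = "close"
        else:
            feedback[field] = "miss"
    return feedback
```

The multiplayer mode only changes who submits guesses; the grading function itself stays a pure, easily testable core.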


© Reynaldo Jarro.