Hi! My name is Carlos. In my current role (6 months of experience as of September 2025), I work on video editing and on developing applications that integrate VR/AR/AI technologies, continually expanding my expertise in both fields.
In my previous role, I gained 2 years of experience developing mobile applications, including games and digital twin technologies on mobile devices. I am particularly passionate about gaming and have undertaken several personal projects in Unity. These projects span 2D, 3D, AR, mobile, and educational applications, demonstrating my versatility and enthusiasm for game development.
I hold a Bachelor's degree in Computer Engineering and a Master's degree in Computer Engineering with a focus on Mobile Computing, and I am confident in my ability to contribute effectively to your team.
I’m also enthusiastic about video editing and production, including color correction, audio, transitions, visual effects, lighting, and framing. I recently completed an online course in high-quality video production to enhance my skills, adding to what I’ve learned through self-study.
Thank you for your time!
Skills:
Unity, Unity2D, Unity3D, Unity Multiplayer, Photon, Relay, Augmented Reality, Virtual Reality, Vuforia, ARCore, Digital Twin, Artificial Intelligence, HeyGen, Synthesia, C, C#, JavaScript, Java, Python, HTML, CSS, TypeScript, PHP, SQL, MySQL, MongoDB, Full-stack development, Front-end development, Git, WebSockets, Vue.js, Niantic Studio (8th Wall), Visual Studio, Visual Studio Code, Photoshop, Vegas Pro, CapCut, DaVinci Resolve, Adobe After Effects, image editing, video editing, video production, video effects.
What I'm doing
Video Editor, Web Developer, VR/AR Developer
• Video Editor: Creation of presentation videos, AI-enhanced content (e.g., avatars, animated text), product demonstrations, and promotional materials.
• VR/AR Developer: Development of an augmented reality application for enhanced workplace training; development of a Digital Twin XR system that enables users to interact with virtual and real equipment through virtual elements (e.g., using VR/AR glasses to control and monitor objects, a virtual environment for training, and a real environment for user engagement and assistance).
• Web Developer: Design and development of custom Massive Open Online Courses (MOOCs) with videos, quizzes, and personalized learning features; creation of a chatbot capable of responding to course-specific questions, using text-to-speech, speaking avatars, and integrated quiz functionality.
Game Development
Implementation of games using Unity for both PC and mobile platforms, including 2D and 3D games, multiplayer titles, serious games, and augmented reality experiences.
Courses
. Build A Multiplayer Augmented Reality (AR) Game With Unity - UDEMY
. The Complete Java Certification Course - UDEMY
. Unreal Engine 5 C++ The Ultimate Game Development Course - UDEMY
This study presents a Digital Twin (DT) Augmented Reality (AR) system for UAVs (drones)
that enables users to visualize, control, and retrieve both virtual and real
UAVs for wildfire prevention and monitoring through a mobile device. The
proposed DT-based mobile application leverages Augmented Reality
and includes four key services: UAV Flight Data Generator, Visualization,
Control, and Retrieval. To facilitate the deployment of UAVs,
GPS coordinates were used to regulate their positioning and movement.
This method allows users to test and train UAV control in a virtual
environment before moving to real-world scenarios, reducing risks and costs.
The proposed DT system leads to the following contributions:
• Digital Twin-based mobile application to visualize, control and retrieve UAVs using AR;
• Autonomous UAV data attribution and movement using external UAV data for real-time remote visualization;
• Virtual environment for users to test and train UAV control without risks;
• Implementation of UAVs’ DT to operate in real environments;
• Hybrid approach to allow the combination of real and virtual environments;
• Integration with a web application and Gazebo simulator for UAV management (Websockets).
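The WebSocket integration above exchanges control and telemetry messages between the mobile application, the web application, and the Gazebo simulator. As a minimal, self-contained sketch of how such frames might be encoded and validated (the field names and action list are illustrative assumptions, not the project's actual protocol):

```javascript
// Hypothetical JSON message helpers for a DT WebSocket bridge.
// Fields (type, uavId, lat, lon, alt, battery) are assumed, not the
// project's real schema.

// Build a control command for a given UAV, rejecting unknown actions.
function buildCommand(uavId, action, params = {}) {
  const allowed = ["takeoff", "land", "goto", "retrieve"];
  if (!allowed.includes(action)) {
    throw new Error(`Unknown action: ${action}`);
  }
  return JSON.stringify({ type: "command", uavId, action, params });
}

// Parse an incoming telemetry frame, returning null for malformed payloads.
function parseTelemetry(raw) {
  const msg = JSON.parse(raw);
  if (msg.type !== "telemetry") return null;
  const { uavId, lat, lon, alt, battery } = msg;
  if ([lat, lon, alt].some((v) => typeof v !== "number")) return null;
  return { uavId, lat, lon, alt, battery };
}
```

Validating frames on both ends keeps a misbehaving client from corrupting the shared UAV state.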
This application contains the following developed features:
• Augmented Reality (AR) visualization of UAVs along with their associated information;
• A comprehensive list of UAVs, both virtual and real, for management and selection;
• Multiple commands for managing real UAVs efficiently;
• Manual control of UAVs using joysticks for directional movement and rotation;
• Live visualization of the UAV's camera feed;
• Integration with a map displaying the user's location;
• Identification and retrieval of UAVs experiencing low battery or other status issues.
The manual control of unmanned aerial, terrestrial, or marine vehicles is a very common activity today.
However, the control of multiple vehicles at the same time is a complex task that requires a high level of skill and attention from the operator.
In this context, the use of augmented reality (AR) can be a valuable tool to enhance the operator's situational awareness and improve the control of multiple vehicles.
This project presents an Augmented Reality Multi-Vehicle Controller Service (ARMVC) that allows users to control multiple unmanned vehicles simultaneously using a mobile device with AR.
It includes a virtual environment for users to test and train UAV control without risks, and a real environment for users to control real vehicles using joysticks and other commands.
The ARMVC system is designed to be user-friendly and intuitive, allowing users to easily visualize and control multiple vehicles in real-time.
The proposed system has the following features:
• Create and manage user accounts;
• Manage user vehicles;
• Manage and visualize the details of a specific vehicle;
• Manual control of UVs (Unmanned Vehicles) and UAVs (Unmanned Aerial Vehicles) in real/simulated scenarios through the mobile application;
• Vehicle registration and control authorization mechanism through Augmented Reality;
• Administration web application.
The aim of this project is to offer a multiplayer Four in a Line game, built with HTML, CSS, and JavaScript,
that uses Docker for local play and Terraform with Google Cloud for online play. Four in a Line is a
classic two-player strategy game in which participants strive to create a sequence of four tokens on
a grid. Although it is a solved game (the first player can always win with the right strategy), it remains
accessible to players of all skill levels. The game is developed as a web application, following the project
requirements, which include decoupling it into three microservices, provisioning with Terraform, and
utilizing automated deployments from a Git repository.
The project is organized in a structured file system that separates the client-side, server-side, and
infrastructure elements. The system comprises a client, a WebSocket server, and a REST API, all
interconnected, with MongoDB serving as the database for the REST API (a local MongoDB instance for
local play and MongoDB Atlas in the cloud). The client side deals with the user interface and game logic, whereas the server side
handles the WebSocket server, database connection and REST API to enable real-time communication
between players and preserve game data. By using contemporary technologies such as Node.js,
Express.js, and MongoDB, this project delivers a seamless and captivating gaming experience for all
users.
The addition of Docker and Terraform streamlines the deployment process both offline and online and allows
for easy scaling of the application. In this project, Docker is used to containerize each microservice (client,
server, and database) for smooth deployment and testing in a local environment, and Terraform is
used to provision and manage the necessary project components on Google Cloud Platform.
The proposed system has the following features:
• Implementing the game code (HTML, CSS, JavaScript) so that all three services (client, server, and database) communicate with each other, and ensuring the game is playable for everyone, using rooms.
• Ensuring that the Docker containers were configured correctly and communicating with each other.
• Configuring build triggers for each component in GCP and connecting the different services (client and server) in Terraform.
• Using two different databases for the different scenarios (MongoDB locally and MongoDB Atlas in the cloud).
• Creating extra configuration (e.g., cmd.sh) in the project so that Google Cloud detects the project and can communicate with the client and server.
• Ensuring that communication did not fail while users were playing, and that other players could see the winner's score when entering a game in another room.
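The core of the game logic above is deciding when a player has connected four tokens. A minimal sketch of that server-side check, assuming the board is a rows-by-columns array of player ids (or null for empty cells):

```javascript
// Win check for Four in a Line: scan every cell and look for four equal
// tokens running right, down, or along either diagonal.
function checkWinner(board) {
  const rows = board.length, cols = board[0].length;
  const dirs = [[0, 1], [1, 0], [1, 1], [1, -1]]; // right, down, two diagonals
  for (let r = 0; r < rows; r++) {
    for (let c = 0; c < cols; c++) {
      const player = board[r][c];
      if (player == null) continue;
      for (const [dr, dc] of dirs) {
        let count = 1;
        let nr = r + dr, nc = c + dc;
        while (nr >= 0 && nr < rows && nc >= 0 && nc < cols && board[nr][nc] === player) {
          count++;
          if (count === 4) return player; // found four in a line
          nr += dr;
          nc += dc;
        }
      }
    }
  }
  return null; // no winner yet
}
```

Running this check on the server after every move (rather than trusting the client) keeps the game state authoritative for all players in a room.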
This is an e-commerce application built using the MERN stack (MongoDB, Express, React, Node.js).
The application allows users to register, log in, add products, search, update, and delete products.
Authentication is implemented using JSON Web Tokens (JWT) for secure access.
This is a project to learn about the MERN stack and its components, and to practice
building a full-stack web application with user authentication and product management features.
The proposed system has the following features:
• Authentication and Authorization:
User registration and login.
JWT-based token storage for route protection.
• Product Management:
Add new products.
List products belonging to the logged-in user.
Search for products by name, brand, or category.
Update and delete products.
• Frontend:
Built with React to provide a modern user interface.
The main objective of this project is to develop a clinical enterprise application for managing biomedical data and signals of individuals with cardiovascular diseases.
This repository showcases the front-end (client) of the application, built with Nuxt.js, while the backend is implemented in Java, and the environment is managed using Docker.
The proposed system has the following features:
• Role-based features: The system provides different functionalities for administrators, healthcare professionals, and patients, ensuring appropriate access and permissions.
• Management operations (CRUD): Users can create, edit, delete, and restore entities such as patients, professionals, administrators, biometric data types, prescriptions, and observations.
• Health data monitoring: The platform manages biometric data, prescriptions, observations, and related documents, allowing healthcare professionals to track patient information.
• Dashboards and statistics: Each role has a dashboard displaying relevant information, such as recent prescriptions, biometric data, and system statistics.
• Notifications and communication: The system sends email notifications for important events, including account changes, new prescriptions, observations, and patient–professional associations.
• Additional system features: Includes soft deletes, document uploads, data import/export, responsive interface, and data classification updates.
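The soft-delete behaviour listed above means entities are flagged as deleted instead of being removed, so they can later be restored. A tiny illustrative sketch of that pattern (the entity shape is an assumption, not the application's actual model):

```javascript
// Soft delete: mark the entity as deleted with a timestamp instead of
// removing it, so an administrator can restore it later.
function softDelete(entity) {
  return { ...entity, deleted: true, deletedAt: new Date().toISOString() };
}

// Restore: clear the deleted flag and drop the deletion timestamp.
function restore(entity) {
  const { deletedAt, ...rest } = entity;
  return { ...rest, deleted: false };
}

// Default listings only show entities that are not soft-deleted.
function visible(entities) {
  return entities.filter((e) => !e.deleted);
}
```

For clinical data this matters: records must remain recoverable and auditable rather than being destroyed on delete.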
This study presents a Digital Twin (DT) Web system for UAVs (drones)
that enables users to visualize, control, and plan missions for both virtual and real
UAVs for wildfire prevention and monitoring through a web application. The
proposed DT-based web application includes four key services: UAV Flight Data Generator, Visualization,
Control, and autonomous movement through GPS coordinates. To facilitate the deployment of UAVs,
GPS coordinates were used to regulate their positioning and movement.
This method allows users to test and train UAV control in a virtual
environment before moving to real-world scenarios, reducing risks and costs.
Project Paper: (In Progress)
The proposed DT system leads to the following contributions:
• Digital Twin-based web application to visualize, control, and plan missions for UAVs using the Google Maps API;
• Autonomous UAV data attribution and movement using external UAV data for real-time remote visualization;
• Virtual environment for users to test and train UAV control without risks;
• Hybrid approach to allow the combination of real and virtual environments;
• Integration with Gazebo simulator for UAV management (Websockets).
This application contains the following developed features:
• Drone Markers (using Google Maps API) for visualization of UAVs along with their associated information;
• A comprehensive list of UAVs, both virtual and real, for management and selection;
• Multiple commands for managing real UAVs efficiently;
• Manual control of UAVs using the arrow keys for directional movement;
• Live visualization of the UAV's camera feed;
• Autonomous Movement in a certain area using virtual and real UAVs.
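The autonomous movement above steers a UAV between GPS coordinates. As a hedged, self-contained sketch of that idea (a flat-earth distance approximation and a fixed step per tick; the project's actual controller may differ):

```javascript
// Move a (virtual) UAV a fixed distance per tick toward a target waypoint.
// The equirectangular approximation is adequate over the short distances
// of a monitoring area.
const EARTH_RADIUS_M = 6371000;

// Approximate distance in meters between two lat/lon points.
function distanceMeters(a, b) {
  const toRad = (d) => (d * Math.PI) / 180;
  const x = toRad(b.lon - a.lon) * Math.cos(toRad((a.lat + b.lat) / 2));
  const y = toRad(b.lat - a.lat);
  return Math.hypot(x, y) * EARTH_RADIUS_M;
}

// Advance up to stepMeters toward the target; snap to it when close enough.
function stepToward(pos, target, stepMeters) {
  const dist = distanceMeters(pos, target);
  if (dist <= stepMeters) return { ...target, arrived: true };
  const t = stepMeters / dist; // linear interpolation fraction
  return {
    lat: pos.lat + (target.lat - pos.lat) * t,
    lon: pos.lon + (target.lon - pos.lon) * t,
    arrived: false,
  };
}
```

Each tick's position can then be pushed over the WebSocket link to update the drone marker on the map in real time.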
The system allows trainers to create training scenarios based on immersive experiences using augmented reality technologies.
Both the trainer and the trainee have access to specific applications on certain platforms, enabling them to perform their tasks.
The system is composed of two applications and a storage service that share content:
- TCA (Training Configuration Application) – a desktop application for trainers, acting as a configurator/editor of training scenarios;
- ARTA (AR Training Application) – a mobile application used by trainees to experience the training scenarios created by trainers in the TCA application;
- TSIS (Training Session Information System) – a server-hosted service for storing the created training scenarios and providing access and submission endpoints to both applications, according to their respective requirements.
This personal project aims to explore a novel concept by combining 2D and 3D elements in a single game.
The player controls two characters: one can manipulate 2D physical objects, while the other can handle 3D physical objects.
The main goal is to solve puzzles and defeat enemies to progress through the game.
These puzzles require both characters to work together, using both 2D and 3D objects to succeed.
Each character can only move objects of a specific color:
• Green - 2D Physics and Objects
• Blue - 3D Physics and Objects
The game includes:
• 2D and 3D dimensions (the camera perspective changes depending on the character currently in use)
• Puzzle-solving and enemy combat
• Abilities such as jumping, dashing, attacking, and moving blocks
• An OnlyUp-style platform game mode, with 3 different scenarios
• Abilities such as jumping, speed boosts, dashing, homing attacks, and double jumps
• Shop to customize the character and scenario
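The color rule above (green for 2D physics, blue for 3D physics) is essentially a permission check when a character tries to grab an object. A tiny sketch of that mechanic, with illustrative names (the actual game is built in Unity):

```javascript
// Each object color maps to a dimension; a character may only move objects
// whose color matches the dimension that character controls.
const COLOR_DIMENSION = { green: "2D", blue: "3D" };

function canMove(character, object) {
  // character.dimension is "2D" or "3D"; object.color is "green" or "blue".
  return COLOR_DIMENSION[object.color] === character.dimension;
}
```

Gating interactions on this single lookup is what forces the two characters to cooperate on puzzles that mix 2D and 3D objects.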
This project is a mobile 2D game designed to help kids learn about nutrition through play.
The game provides strategies for children to understand the nutritional value and properties of various foods,
including proteins, fiber, carbohydrates, and fats.
The main game includes multiple levels where players need to:
- Collect all essential nutrients (proteins, fiber, carbohydrates, and fats) by catching healthy foods until the indicators turn green, signifying good health.
- Unlock two new fruits at each level, increasing the difficulty while expanding knowledge about different foods.
- Navigate levels containing both good and bad fruits, requiring players to distinguish between healthy and unhealthy options.
Additional games and features include:
- Main Game (Level Phased)
- Cooking Smoothie (Learn to shop for ingredients, cut them, and make smoothies)
- Library (A resource to help players understand the nutritional value of various foods)
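The nutrient indicators described above track totals per nutrient and turn green once a target is reached. A small illustrative sketch of that rule (the food values and targets are made-up examples, not the game's actual data):

```javascript
// Example nutrient targets; an indicator turns green once its total is met.
const TARGETS = { protein: 50, fiber: 25, carbs: 130, fat: 40 };

// Catching a food adds its nutrient values to the running totals.
function catchFood(totals, food) {
  const next = { ...totals };
  for (const [nutrient, amount] of Object.entries(food.nutrients)) {
    next[nutrient] = (next[nutrient] || 0) + amount;
  }
  return next;
}

// Compute the color of each indicator from the current totals.
function indicators(totals) {
  const out = {};
  for (const [nutrient, target] of Object.entries(TARGETS)) {
    out[nutrient] = (totals[nutrient] || 0) >= target ? "green" : "red";
  }
  return out;
}
```

A level is cleared once every indicator reports green, which is what ties the catching mechanic back to the learning goal.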
This project is a mobile augmented reality game where players build their own castle and aim to destroy the enemy's castle.
The player who successfully destroys the opponent's castle wins.
The multiplayer mode was developed using the Photon Engine.
- In the first phase, the player must build their castle with two blocks in height and four blocks in length on each of the four sides.
- In the second phase, both players have a limited number of cannons to place on their castle walls.
- The third phase involves a timed confrontation where players fire cannons at each other's castles within a time limit.
- In the fourth phase, players must rebuild their castles to close any openings within a set time limit (reducing each time). Phases 3 and 4 repeat until one player is unable to close their castle, resulting in their defeat.
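The four phases above form a loop: build, place cannons, then combat and rebuild alternate with a shrinking rebuild timer until one player cannot close their castle. A hedged sketch of that loop as a small state machine (Photon networking and real timers omitted; field names are illustrative):

```javascript
// Phase state machine for the castle game. state.openCastle records
// whether the player failed to close their castle in the rebuild phase.
function nextPhase(state) {
  switch (state.phase) {
    case "build":
      return { ...state, phase: "placeCannons" };
    case "placeCannons":
      return { ...state, phase: "combat" };
    case "combat":
      return { ...state, phase: "rebuild" };
    case "rebuild":
      // A player who cannot close their castle in time loses; otherwise
      // combat and rebuild repeat with a shorter rebuild timer.
      if (state.openCastle) return { ...state, phase: "gameOver" };
      return {
        ...state,
        phase: "combat",
        rebuildSeconds: Math.max(5, state.rebuildSeconds - 5),
      };
    default:
      return state;
  }
}
```

Keeping the phase logic in one pure function like this makes it easy for both networked clients to agree on the current phase.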
This game contains the following developed features:
- Main Game (Multiplayer - 2 players)
- Creative (Unlimited block placement and shooting practice)
- Tutorial (Step-by-step learning of game phases)
- Shop (Earn coins in the main game to customize your castle layout)
This game is a 2D multiplayer adventure where players must work together to progress through levels and collect items in Simple Land.
Players can use blocks to aid each other by moving and placing them on the map with a swipe of the finger.
The multiplayer functionality is powered by Unity Relay Networking.
• Create and join lobby
• Captain role (can insert, move, and enlarge blocks in the game)
• Other players (cannot manipulate blocks)
• Levels with unique abilities (super jumping, dashing, etc.)
The game also features:
• Main Game (Multiplayer: 1-8 players)
• Minigames (Singleplayer: e.g., meteor escape, Pac-Man style challenges)
• Shop (Purchase clothes for characters)
• Achievements (Earned by playing the main game)
Video Editing (HeyGen, CapCut), Web Development (Moodle)
Developed comprehensive online courses (MOOCs) hosted on Moodle, integrating interactive quizzes and multimedia content.
Created instructional videos using HeyGen (AI avatars for narration) and CapCut.
The courses covered various topics and were designed to support self-paced learning with a focus on accessibility and interactivity.
The MOOCs contain the following developed features:
- Educational videos created with AI avatars using HeyGen, including animated transitions between explanations to enhance student engagement.
- Video editing and enhancement using CapCut, including promotional content and refinement of HeyGen-produced videos.
- Customized Moodle environment with HTML, CSS, and JavaScript to create interactive elements such as buttons, dynamic actions, and embedded quizzes.
- Designed quizzes to reinforce learning and help students grasp key concepts.
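The embedded quizzes mentioned above boil down to grading a set of answers and returning per-question feedback. A small sketch of the kind of JavaScript that could back such a Moodle page element (the question format is an assumption):

```javascript
// Grade a multiple-choice quiz: compare each given answer to the expected
// one and return a score plus per-question feedback for display.
function gradeQuiz(questions, answers) {
  const feedback = questions.map((q, i) => ({
    question: q.text,
    correct: answers[i] === q.answer,
  }));
  const score = feedback.filter((f) => f.correct).length;
  return { score, total: questions.length, feedback };
}
```

The feedback array can then drive the dynamic page actions, e.g. highlighting missed questions or linking back to the relevant video segment.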
This project is an intelligent chatbot developed using Hugging Face technologies.
It is designed to answer students’ questions about a specific subject, making learning more interactive and accessible.
The chatbot can be seamlessly embedded into other websites, allowing easy integration into educational platforms, course pages, or institutional portals.
The chatbot contains the following developed features:
- Answers course-specific questions posed by students
- Text-to-speech functionality to enhance audio output
- Speaking avatars that vocalize the answers (Lip-sync)
- Quizzes with course-specific questions to help students learn more effectively
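The real chatbot is built on Hugging Face models; as a self-contained illustration of the course-specific question answering it performs, here is a deliberately naive retriever that picks the FAQ entry sharing the most words with the student's question (the FAQ format and fallback text are assumptions):

```javascript
// Naive keyword-overlap retrieval over a small FAQ: tokenize the input,
// score each stored question by shared words, and return the best answer.
function answer(faq, question) {
  const tokenize = (s) => new Set(s.toLowerCase().match(/[a-z0-9]+/g) || []);
  const qWords = tokenize(question);
  let best = null, bestScore = 0;
  for (const entry of faq) {
    let score = 0;
    for (const w of tokenize(entry.q)) if (qWords.has(w)) score++;
    if (score > bestScore) { bestScore = score; best = entry; }
  }
  return best ? best.a : "Sorry, I don't know that one yet.";
}
```

A model-backed chatbot replaces this word-overlap score with semantic embeddings, but the retrieve-then-answer shape is the same.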
This project is a Digital Twin XR system, made in Unity and using Meta Quest 3, that enables users to interact with both virtual and real equipment through immersive technologies. By using VR/AR devices, users can control and monitor physical objects, train in virtual environments, and engage with real-world systems enhanced by virtual elements.
The goal is to make working with real equipment easier, more intuitive, and efficient by providing guided tutorials, visual indicators, and interactive support.
This project contains the following developed integrations:
- Virtual Reality (VR): Allows users to safely learn how to operate industrial equipment through guided tutorials and Digital Twin simulations, enabling the exploration of “what-if” scenarios without the risk of real-world errors.
- Augmented Reality (AR): Delivers real-time information and step-by-step guidance, helping users operate industrial equipment more efficiently by overlaying relevant data directly onto the physical environment.
- Firebase Integration: Supports the storage and retrieval of user data, enabling personalized scenarios or improving the overall learning experience.
The project currently provides the following features:
- Access to a dashboard containing categories and items.
- Item selection with the ability to preview and customize elements in real time, including colors, text, images, and interface components.
- Augmented Reality visualization, allowing items to be placed in the real world with the ability to move and resize them.
- Item editing at any time, enabling quick adjustments during use.
- Ability to return to the application and restore the scenario using spatial anchors.
- Manipulation of industrial equipment through interactive panels in real time.
- Step-by-step guided interaction with visual indicators to assist users in operating the equipment effectively (future development).
- Audio messages and alerts for error or anomaly situations (future development).