When shopping for cosmetic products, finding the right makeup color can be quite confusing. Many virtual makeup tools overlay effects on a phone screen, making it easier to experiment, but they often seem artificial and can be overwhelming with hundreds of color options.
The results are also displayed on a flat screen, which rarely matches how makeup actually looks on real skin under natural light. That’s why researchers have developed a system that projects makeup directly onto your face when you simply describe the look you want.
Scientists at the Institute of Science Tokyo have developed an experimental technology that uses artificial intelligence with projection mapping to simulate makeup on a real face.
Instead of physically applying cosmetics or relying on typical augmented reality filters, the system beams makeup directly onto the user’s skin, allowing them to see what it would look like under real light.
AI translates moods into real makeup projections
The system is based on what the researchers call an impression-driven text-to-makeup color model. Instead of manually selecting shades, you simply describe the mood or impression you’re going for.
You can say “sakura in spring,” “night rose,” or even “autumn forest with warm sunlight.” The AI interprets this description and generates a reference image that captures the mood.
It then creates five recommended color palettes for key areas including cheeks, eyeshadow and lips. The shades created are refined using real cosmetic color distributions, making them look more like real makeup.
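The pipeline described above can be illustrated with a toy sketch. The keyword-to-hue table and the lightness values below are invented for illustration; the actual system generates a reference image with AI and refines the extracted colors against real cosmetic color distributions, which this sketch does not do.

```python
import colorsys

# Hypothetical keyword-to-hue table standing in for the learned
# text-to-makeup color model (values are illustrative, not from the paper).
MOOD_HUES = {
    "sakura": 0.95,   # soft pink
    "rose": 0.97,
    "autumn": 0.07,   # warm orange-brown
    "forest": 0.30,
}

def palettes_for(description, n_palettes=5):
    """Return n_palettes {cheek, eyeshadow, lip} RGB palettes for a mood text."""
    words = description.lower().split()
    hues = [MOOD_HUES[w] for w in words if w in MOOD_HUES] or [0.0]
    base_hue = sum(hues) / len(hues)
    palettes = []
    for i in range(n_palettes):
        # Vary saturation per palette; lips darkest, cheeks lightest.
        sat = 0.35 + 0.1 * i
        cheek = colorsys.hls_to_rgb(base_hue, 0.80, sat)
        eye = colorsys.hls_to_rgb(base_hue, 0.60, sat)
        lip = colorsys.hls_to_rgb(base_hue, 0.45, min(1.0, sat + 0.2))
        palettes.append({
            "cheek": tuple(round(c, 3) for c in cheek),
            "eyeshadow": tuple(round(c, 3) for c in eye),
            "lip": tuple(round(c, 3) for c in lip),
        })
    return palettes

# Example: five progressively more saturated pink palettes.
spring_palettes = palettes_for("sakura in spring")
```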
Once the colors are created, a projection system displays the makeup directly on the user’s face. A high-speed camera constantly tracks facial movements, while motion detection keeps the projection aligned with the eyes, lips and cheeks.
Because the system adjusts in real time, makeup stays in place even when the user turns their head or changes their facial expression.
This real-time tracking is important because traditional virtual makeup filters can lag or slip off the face when someone moves quickly. In this setup, projection mapping keeps the design stable, making the simulated makeup behave more like real cosmetics.
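One common way to keep a projection from jittering between camera frames is to smooth the tracked landmark positions over time. The sketch below shows exponential smoothing of per-region landmarks; the function name, region names, and the `alpha` constant are assumptions for illustration, not details from the published system.

```python
def smooth_landmarks(prev, curr, alpha=0.6):
    """Blend newly detected landmark positions with the previous frame's
    estimate so the projected makeup regions do not jitter.

    prev, curr: dicts mapping a region name ("lip", "cheek", ...) to an
    (x, y) pixel position. alpha weights the new detection; alpha=1.0
    means no smoothing at all.
    """
    if prev is None:  # first frame: nothing to smooth against
        return dict(curr)
    return {
        region: tuple(alpha * c + (1 - alpha) * p
                      for p, c in zip(prev[region], curr[region]))
        for region in curr
    }

# Example: a lip landmark that jumped 10 px moves only 5 px with alpha=0.5,
# so the projected lip color trails smoothly instead of snapping.
estimate = smooth_landmarks({"lip": (0.0, 0.0)}, {"lip": (10.0, 10.0)}, alpha=0.5)
```

In a real projection loop this would run once per camera frame, feeding the smoothed positions to the projector; higher `alpha` tracks fast head turns more tightly at the cost of more visible jitter.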
Possible real-world uses for AI-assisted makeup projection
This technology opens up interesting possibilities that go beyond personal beauty experiments. Makeup artists could quickly test bold color combinations before applying real products.
Fashion designers could preview makeup concepts for runway shows. Beauty brands may even use similar systems in stores to help customers discover new looks without touching a single brush.
Researchers believe this technology could make experimenting with cosmetics easier and help people discover makeup styles that better suit their personal tastes.