MarcX
Native macOS app that seamlessly integrates multiple LLMs with advanced interaction capabilities to accelerate personal workflows.
Skills
Swift, Xcode, macOS Development, CoreML, Vision Framework, Voice Recognition, LLM Integration, UI/UX Design
MarcX was born out of frustration with existing LLM apps like ChatGPT, which share several limitations: restricted voice control, cumbersome screen sharing that requires manual screenshots, limited model selection, and no ability to control the user's computer.
This native macOS app addresses these limitations with a chat box overlay that can be instantly activated via a keyboard shortcut (option+space). Holding the shortcut activates voice control for hands-free interaction. The app includes dedicated buttons for screen sharing and web search grounding, significantly streamlining the workflow.
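The instant-activation overlay described above can be sketched with a system-wide key monitor. This is a minimal, hypothetical example (the class and callback names are illustrative, not MarcX's actual code), assuming the app has been granted Accessibility permission so it can observe keystrokes in other apps:

```swift
import AppKit

// Hypothetical sketch of a global option+space hotkey that shows an
// overlay chat box. Requires Accessibility permission in System Settings.
final class HotkeyMonitor {
    private var monitor: Any?

    func start(onActivate: @escaping () -> Void) {
        // Observe key-down events system-wide, even when the app is not focused.
        monitor = NSEvent.addGlobalMonitorForEvents(matching: .keyDown) { event in
            let isSpace = event.keyCode == 49 // kVK_Space
            let optionOnly = event.modifierFlags
                .intersection(.deviceIndependentFlagsOnly) == .option
            if isSpace && optionOnly {
                onActivate() // e.g. toggle the overlay panel
            }
        }
    }

    deinit {
        if let monitor { NSEvent.removeMonitor(monitor) }
    }
}
```

Distinguishing a tap (show the chat box) from a hold (voice control) would additionally require tracking key-up timing, which is omitted here for brevity.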
What truly sets MarcX apart is its ability to manipulate the user's screen: for example, filling out forms automatically through voice commands. This creates a seamless link between AI assistance and direct computer control.
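On macOS, this kind of form filling is typically done through the Accessibility API. The sketch below is an assumption about the general mechanism rather than MarcX's actual implementation: it writes text into whichever UI element currently has focus in any app, given Accessibility access:

```swift
import ApplicationServices

// Hedged sketch: set the value of the currently focused text field in any
// application via the Accessibility (AX) API. The user must grant this
// app Accessibility access for these calls to succeed.
func fillFocusedField(with text: String) {
    let systemWide = AXUIElementCreateSystemWide()

    // Ask the system-wide element which UI element has keyboard focus.
    var focused: CFTypeRef?
    let result = AXUIElementCopyAttributeValue(
        systemWide,
        kAXFocusedUIElementAttribute as CFString,
        &focused
    )
    guard result == .success, let focused else { return }

    // The focused attribute is always an AXUIElement, so the cast is safe.
    AXUIElementSetAttributeValue(
        focused as! AXUIElement,
        kAXValueAttribute as CFString,
        text as CFString
    )
}
```

A voice-driven flow would sit on top of this: transcribe the command, let the LLM decide which fields to fill, then apply each value through calls like the one above.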
Building MarcX as a native Swift application in Xcode presented challenges but ultimately demonstrated the benefits of native app development. Future plans include broader computer-control capabilities and voice recognition advanced enough to handle commands like "look at my screen and answer question 25" without any manual button presses.

MarcX image