Apple Develops AI Model That Can Edit Images With Prompts


 We know Apple will offer generative AI features in iOS 18, an update expected to be the biggest in the operating system's history. Recently, Apple researchers, working with the University of California, Santa Barbara, have presented a language model that can edit images from a simple prompt.



MGIE, or MLLM-Guided Image Editing, is an AI model built on multimodal large language model (MLLM) technology. It is trained to perform Photoshop-like edits, improve image quality, and handle simple or complex changes from a plain-text prompt. For example, enter a "bluer sky" prompt and the sky in the picture becomes brighter and bluer; give it a picture of a pizza with the prompt "make it healthier" and the meat on the pizza is replaced with vegetables.



Another example the paper shows is that a dark picture can be brightened, or a distracting object in the background removed, with a simple text instruction. MGIE works by analyzing the picture provided by the user and then, based on the prompt it receives, imagining what the edited result should look like.



MGIE is available on GitHub and Hugging Face for developers to try. The paper does not reveal what Apple plans for it: will it become a feature in iOS, or is it just an early test for a future project?
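
For developers curious about this style of prompt-driven editing before digging into Apple's own release, a minimal sketch is shown below using the open InstructPix2Pix pipeline from Hugging Face's diffusers library. This is not MGIE's API; the model name, file names, and settings are illustrative assumptions, but the workflow (one image in, one text instruction, one edited image out) mirrors the examples described above.

    # Sketch of instruction-guided image editing, analogous to MGIE's
    # prompt-driven workflow. Uses the open InstructPix2Pix pipeline from
    # Hugging Face diffusers, NOT Apple's MGIE code; file names and
    # parameter values are illustrative assumptions.
    import torch
    from PIL import Image
    from diffusers import StableDiffusionInstructPix2PixPipeline

    # Load a publicly available instruction-editing model.
    pipe = StableDiffusionInstructPix2PixPipeline.from_pretrained(
        "timbrooks/instruct-pix2pix", torch_dtype=torch.float16
    ).to("cuda")

    # Open the image to edit (hypothetical local file).
    image = Image.open("pizza.jpg").convert("RGB")

    # Apply a plain-text edit instruction, like the "make it healthier" example.
    edited = pipe(
        "make it healthier",
        image=image,
        num_inference_steps=20,
        image_guidance_scale=1.5,
    ).images[0]

    edited.save("pizza_healthier.jpg")
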
