It’s unclear whether the project will make it to any Apple products, but you can try it out now on GitHub.
By Adrianna Nine, February 9, 2024
https://www.extremetech.com/computing/a ... age-editor
With the help of researchers at the University of California, Santa Barbara, Apple has developed an AI-powered image editor that responds to text-based commands. A demo is now live on Hugging Face, though it's unclear whether the project will ever make it into any Apple products.
Apple's model is called MGIE, short for MLLM-Guided Image Editing, where MLLM stands for "multimodal large language model." In plain terms, MGIE leverages a large language model that can process natural-language text and image inputs together, a capability essential to any AI-based image editor. Rather than fiddling with sliders, filters, and other manual image-editing tools, MGIE users can simply type a quick text prompt describing what they want the result to look like.
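To make the workflow concrete, here is a toy sketch of the kind of text-driven editing flow the article describes: a language-model step interprets a terse user prompt as a more explicit editing instruction, and an editing model then applies it. Every name and function here is a hypothetical illustration for this article, not Apple's actual MGIE API, and the "models" are stubs.

```python
from dataclasses import dataclass

# Hypothetical sketch of prompt-driven image editing.
# None of these names come from Apple's MGIE code; the two model
# steps are stand-ins for the real MLLM and editing networks.

@dataclass
class Image:
    pixels: list  # placeholder for real image data

def expand_instruction(prompt: str) -> str:
    """Stand-in for the MLLM step: turn a terse user prompt into a
    more explicit editing instruction. A real multimodal model would
    also condition on the image itself, not just the text."""
    canned = {
        "make the sky bluer": (
            "increase saturation and shift the hue of sky regions toward blue"
        ),
    }
    return canned.get(prompt.lower(), f"apply edit: {prompt}")

def edit_image(image: Image, prompt: str) -> Image:
    """Stand-in for the editing model: apply the expanded instruction.
    A real model would return modified pixels; this stub just echoes
    the derived instruction and returns the image unchanged."""
    instruction = expand_instruction(prompt)
    print(instruction)
    return image

edited = edit_image(Image(pixels=[0, 0, 0]), "make the sky bluer")
```

The point of the two-step split is the one the article hints at: the user types something casual, and the language model supplies the precise editing semantics that a conventional tool would demand via sliders and filters.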