[AppleScript] set "miniaturized" property failed

Hi,

Recently we tested minimizing an app using AppleScript, as in the picture below, and found an interesting phenomenon. After executing the script, if "miniaturized" is highlighted in green (recognized as a parameter), setting the minimized property fails. If "miniaturized" is highlighted in purple (recognized as a property), setting it works.

Because the syntax is exactly the same in both cases, could you help us figure out what is wrong with the script?

Sincerely, Thanks, YM

You are not addressing some generic "window" class, but different "window" classes belonging to different processes, which just happen to have the same name. They may have different properties and actions. Take a look at those applications' respective dictionaries (Cmd+Shift+O in Script Editor) and you'll notice that "miniaturized" is in Safari's dictionary but not in Finder's, for example. You can get consistent behavior across applications if you use "System Events" to minimize the windows:

tell application "System Events" to set the value of attribute "AXMinimized" of window 1 of process "Finder" to true
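The one-liner above acts on a single window. As a sketch not from the original thread, it can be wrapped in a handler that minimizes every window of a named process; the handler name and the try block guarding windows that reject the attribute are my own assumptions:

```applescript
-- Hypothetical helper: minimize all windows of the named process.
-- Requires the script runner to have Accessibility access
-- (System Settings > Privacy & Security > Accessibility).
on minimizeAll(processName)
	tell application "System Events"
		tell process processName
			repeat with w in windows
				try
					-- Same Accessibility attribute as the one-liner above
					set value of attribute "AXMinimized" of w to true
				end try
			end repeat
		end tell
	end tell
end minimizeAll

minimizeAll("Finder")
```

Because "System Events" goes through the Accessibility layer rather than each app's own scripting dictionary, the same code path works whether or not the target app defines a "miniaturized" property.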

It works for "Finder", "Evernote", "Google Chrome", and any other app. Thanks for your suggestion.

Huge thanks to kmitko -- this works for TeamViewer, which unminimizes its main window based on unrelated system events (like checking if there are macOS updates). This allows me to run my weekly updates checker and not leave TeamViewer's window suddenly blotting out the rest of my screen.
