Human Generated Data

Title

Chicago

Date

1961

People

Artist: Harry Callahan, American 1912 - 1999

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.483

Copyright

© Estate of Harry Callahan, Courtesy of Pace Gallery

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 98.1
Human 98.1
Clothing 98
Apparel 98
Person 97.9
Person 95.2
Person 91.2
Overcoat 88.3
Coat 88.3
Building 84
Architecture 84
Tower 84
Clock Tower 84
Military Uniform 68.7
Military 68.7
Sleeve 58.4
Officer 57
Suit 55.7
Shop 55.6

Clarifai
created on 2023-10-25

people 99.8
man 97.7
adult 94.5
portrait 94.1
one 94.1
wear 92.8
two 92.2
music 92.1
leader 91.5
war 90.7
military 88.6
administration 88.1
three 87.6
outfit 87
group 85.7
soldier 83.8
art 82.5
military uniform 79.1
woman 78.5
retro 76.1

Imagga
created on 2022-01-09

military uniform 40.6
uniform 38.8
weapon 35.1
clothing 28
man 24.8
black 21.1
male 19.8
person 19.7
covering 19
instrument 18.6
device 17.3
consumer goods 16.8
bow 15.6
statue 15.6
portrait 15.5
adult 14.9
people 14.5
religion 14.3
soldier 13.7
warrior 13.7
sword 13
art 13
military 12.5
dark 12.5
sculpture 12.4
musical instrument 12
gun 11.8
army 11.7
vintage 11.6
war 11.5
protection 10.9
face 10.6
fashion 10.5
old 10.4
power 10.1
music 9.9
faith 9.6
ancient 9.5
expression 9.4
catholic 8.7
model 8.6
attractive 8.4
rifle 8.3
silhouette 8.3
holding 8.2
history 8
posing 8
helmet 7.8
wall 7.7
musician 7.7
culture 7.7
god 7.6
cross 7.5
religious 7.5
monument 7.5
style 7.4
sport 7.4
church 7.4
commodity 7.4
microphone 7.3
pose 7.2
metal 7.2
hair 7.1
architecture 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.6
person 97.2
clothing 96.2
concert 92.2
man 85
musical instrument 77.9
black and white 72
drum 50.1
old 49.8

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 93%
Fear 32.6%
Angry 26.8%
Sad 14.7%
Confused 12%
Calm 4.9%
Disgusted 4.5%
Surprised 3.9%
Happy 0.6%

AWS Rekognition

Age 21-29
Gender Female, 99.6%
Calm 91.1%
Fear 4%
Sad 3.5%
Angry 0.4%
Surprised 0.4%
Disgusted 0.3%
Confused 0.3%
Happy 0.1%

Microsoft Cognitive Services

Age 38
Gender Female

Microsoft Cognitive Services

Age 36
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.1%

Text analysis

Amazon

XILA
U
VII
III
U III XILA VII IN 9
9
IN