Human Generated Data

Title

Untitled (couple posed in studio for portrait, man in military uniform)

Date

c. 1945

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1833

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Apparel 98.3
Clothing 98.3
Person 98
Human 98
Person 94.2
Costume 67.7
Flower 64.2
Plant 64.2
Blossom 64.2
Furniture 59.4
Performer 57.7
Dress 55.8

Imagga
created on 2021-12-14

person 25.2
man 20.4
people 17.8
clothing 17.8
fashion 16.6
male 15.6
portrait 15.5
mask 15.4
outfit 15.2
adult 14.5
dress 14.4
statue 14.1
body 12
elegance 11.7
active 11.7
model 11.7
performer 11.7
silhouette 11.6
art 10.9
sport 10.7
black 10.7
fun 10.5
leg 10.1
protection 10
industrial 10
radioactive 9.8
toxic 9.8
nuclear 9.7
style 9.6
sexy 9.6
gas 9.6
women 9.5
pretty 9.1
danger 9.1
stylish 9
dirty 9
stalker 8.9
radiation 8.8
soldier 8.8
destruction 8.8
protective 8.8
costume 8.7
architecture 8.7
hair 8.7
military 8.7
chemical 8.7
party 8.6
face 8.5
power 8.4
sculpture 8.4
action 8.3
figure 8.3
weapon 8.3
bride 8.3
human 8.2
dance 8.1
dancer 8
lifestyle 7.9
accident 7.8
wind instrument 7.7
industry 7.7
sky 7.6
historical 7.5
monument 7.5
holding 7.4
musical instrument 7.4
lady 7.3
history 7.2
happiness 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

wall 97.7
clothing 89
text 79.1
person 77.8
statue 67.8
black and white 62.3
posing 43.9

Face analysis

AWS Rekognition

Age 13-25
Gender Female, 62.6%
Calm 79.7%
Confused 6.2%
Sad 5.3%
Happy 4.4%
Surprised 1.6%
Fear 1.1%
Angry 1.1%
Disgusted 0.6%

AWS Rekognition

Age 22-34
Gender Female, 50.4%
Calm 95.5%
Happy 1.4%
Sad 1.2%
Angry 0.8%
Surprised 0.7%
Confused 0.2%
Disgusted 0.2%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98%

Captions

Microsoft

a person standing next to a vase 33.8%
a person standing next to a vase 26.1%
a person standing next to a vase with flowers in it 25.3%

Text analysis

Amazon

EVEANTH
19/07