Human Generated Data

Title

Untitled (seated performers in dance costumes)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5678

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 98.6
Human 98.6
Person 95.7
Person 95.6
Person 94.7
Military 93.3
Military Uniform 92.2
Person 91.9
Person 90.6
Crowd 90.1
People 85.8
Clothing 85.3
Apparel 85.3
Helmet 85.3
Marching 74.9
Army 70.4
Armored 70.4
Person 67.8
Soldier 66.8
Musical Instrument 64.1
Musician 64.1
Parade 60.4
Officer 60
Advertisement 55.9
Poster 55.9

Clarifai
created on 2023-10-15

people 99.6
illustration 96.8
group 95.7
many 94.8
man 92.4
art 91.8
adult 91.3
child 87.1
crowd 86.6
music 86.6
veil 86
war 85.6
military 84.2
engraving 83.4
group together 83.4
wear 81.9
soldier 80.1
woman 79.9
weapon 78.6
monochrome 77.7

Imagga
created on 2021-12-15

perfume 48.6
toiletry 38.6
glass 25.1
container 14.7
black 14.4
vase 12.2
sexy 12
person 12
people 11.7
human 11.2
adult 11
statue 10.7
man 10.6
wedding 10.1
art 10
fashion 9.8
lady 9.7
portrait 9.7
bride 9.6
body 9.6
jar 9.5
party 9.4
grunge 9.4
vessel 9.3
old 9
life 9
water 8.7
model 8.5
male 8.5
event 8.3
sculpture 8.3
style 8.2
light 8
celebration 8
men 7.7
reflection 7.6
city 7.5
figure 7.5
decoration 7.4
game 7.1
women 7.1
love 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.4
sketch 94
drawing 91.1
group 68.4
person 61.5
posing 61.1
cartoon 51.8

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 21-33
Gender Female, 73.5%
Angry 68.5%
Happy 19%
Surprised 4.6%
Calm 2.9%
Fear 2.7%
Confused 1.1%
Sad 1%
Disgusted 0.3%

AWS Rekognition

Age 21-33
Gender Female, 64.2%
Calm 92.5%
Angry 4.5%
Sad 1.3%
Confused 0.9%
Happy 0.6%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 19-31
Gender Female, 50.6%
Angry 84.9%
Happy 7%
Surprised 3.9%
Calm 3.4%
Confused 0.3%
Sad 0.3%
Fear 0.2%
Disgusted 0.1%

AWS Rekognition

Age 21-33
Gender Female, 53.8%
Calm 58%
Surprised 40%
Happy 0.7%
Angry 0.5%
Sad 0.3%
Confused 0.3%
Fear 0.2%
Disgusted 0.1%

AWS Rekognition

Age 16-28
Gender Female, 86.9%
Calm 92%
Angry 2.6%
Sad 1.8%
Happy 1.5%
Surprised 1.3%
Fear 0.3%
Confused 0.3%
Disgusted 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%
Helmet 85.3%

Categories

Imagga

paintings art 84.6%
interior objects 14%

Text analysis

Amazon

13911.
13911. -
-

Google

139 11. 139 || ·
139
11.
||
·