Human Generated Data

Title

Untitled (group of people in costumes under tent)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7676

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.7
Human 99.7
Person 99.4
Person 99.2
Person 98.6
Person 98.4
Person 98
Person 97.8
Person 96.6
Person 95.8
Clothing 94.1
Apparel 94.1
Person 92.5
Crowd 85.2
Musical Instrument 81.2
Musician 81.2
People 75.5
Face 72.5
Person 70.8
Text 69.6
Female 68.8
Overcoat 66
Suit 66
Coat 66
Person 64.9
Leisure Activities 61
Photography 60.8
Photo 60.8
Music Band 59.8
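
The Amazon tag list above has the shape of output returned by Amazon Rekognition's DetectLabels operation: label names paired with confidence scores. A minimal sketch using boto3, assuming configured AWS credentials and a hypothetical local copy of the image (the filename and thresholds are illustrative, not part of the record):

    import boto3

    client = boto3.client("rekognition")  # assumes AWS credentials are configured

    with open("steinmetz_7676.jpg", "rb") as f:  # hypothetical local filename
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,
            MinConfidence=50,
        )

    for label in response["Labels"]:
        # Each label carries a name and a confidence score, matching the
        # "Person 99.7"-style rows in the list above.
        print(f'{label["Name"]} {label["Confidence"]:.1f}')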

Clarifai
created on 2023-10-26

people 99.9
group 98.1
many 97.2
adult 96.5
man 96.3
music 95.3
group together 95.2
woman 95.1
wear 93.3
administration 93.1
child 84.8
leader 83.9
crowd 83.8
wedding 82.4
several 81.6
recreation 80.5
musician 80.1
dancing 79.3
ceremony 78.3
movie 77.7
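
The Clarifai concepts above can be reproduced with Clarifai's v2 predict endpoint over REST. A sketch assuming a general-recognition model; the API key, model ID, and image URL are placeholders:

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder credential
    MODEL_ID = "general-image-recognition"  # assumed public general model ID
    IMAGE_URL = "https://example.org/steinmetz_7676.jpg"  # placeholder

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    response.raise_for_status()

    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        # Concept values are 0-1; scale to match the "people 99.9"-style rows.
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')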

Imagga
created on 2022-01-09

stage 38.3
platform 28.6
life 17.3
people 16.2
man 15.4
old 13.9
building 13.4
adult 13.4
person 13.3
male 11.3
human 11.2
men 11.2
black 10.9
silhouette 10.7
religion 10.7
art 10.7
church 10.2
architecture 10.1
city 10
music 9.9
vintage 9.9
hand 9.9
night 9.8
scene 9.5
love 9.5
party 9.4
spectator 9.2
travel 9.1
business 9.1
wall 8.8
celebration 8.8
light 8.7
statue 8.7
crowd 8.6
historic 8.2
dress 8.1
women 7.9
ancient 7.8
bride 7.7
fashion 7.5
house 7.5
religious 7.5
tourism 7.4
event 7.4
world 7.2
history 7.1
cool 7.1
interior 7.1
sculpture 7.1
happiness 7
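
The Imagga rows above correspond to Imagga's v2 tagging endpoint, which returns tag/confidence pairs over HTTP Basic auth. A sketch with placeholder credentials and image URL:

    import requests

    auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # placeholder credentials
    image_url = "https://example.org/steinmetz_7676.jpg"  # placeholder

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=auth,
    )
    response.raise_for_status()

    for item in response.json()["result"]["tags"]:
        # Each entry pairs a localized tag with a confidence, e.g. "stage 38.3".
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')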

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

clothing 92.9
text 91.7
person 88.7
posing 87.7
concert 75.2
man 72.9
group 62.7
black and white 60.4
people 56.4
line 21.1
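
The Microsoft tags above match the Tags visual feature of Azure Computer Vision's Analyze Image REST call. A sketch with placeholder endpoint, key, and image URL:

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_AZURE_KEY"  # placeholder credential

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": KEY},
        json={"url": "https://example.org/steinmetz_7676.jpg"},  # placeholder
    )
    response.raise_for_status()

    for tag in response.json()["tags"]:
        # Confidences are 0-1; scale to match the "clothing 92.9"-style rows.
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')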

Color Analysis

Face analysis

AWS Rekognition

Age 34-42
Gender Male, 82.3%
Sad 49.1%
Happy 48.5%
Confused 0.9%
Calm 0.6%
Fear 0.3%
Surprised 0.3%
Angry 0.2%
Disgusted 0.2%

AWS Rekognition

Age 54-62
Gender Male, 99.7%
Calm 34.2%
Happy 26.4%
Fear 9.7%
Disgusted 8.2%
Sad 7.3%
Angry 6.9%
Surprised 4.3%
Confused 3%

AWS Rekognition

Age 35-43
Gender Male, 99.8%
Happy 76.5%
Confused 7.7%
Calm 6.5%
Angry 3.3%
Sad 2.1%
Surprised 1.9%
Disgusted 1.4%
Fear 0.5%

AWS Rekognition

Age 21-29
Gender Male, 55.5%
Calm 79.8%
Sad 10%
Disgusted 2.3%
Happy 2.2%
Surprised 1.7%
Confused 1.7%
Fear 1.4%
Angry 0.8%

AWS Rekognition

Age 23-31
Gender Female, 88.8%
Sad 70.4%
Angry 18.4%
Fear 2.8%
Disgusted 1.9%
Surprised 1.8%
Calm 1.8%
Confused 1.7%
Happy 1.2%

AWS Rekognition

Age 45-53
Gender Male, 99.9%
Sad 69.5%
Confused 12%
Calm 6.5%
Happy 5%
Disgusted 3.2%
Fear 1.5%
Surprised 1.2%
Angry 1.1%

AWS Rekognition

Age 54-62
Gender Male, 94.2%
Calm 44.2%
Happy 21.6%
Confused 10.1%
Sad 8.9%
Disgusted 5.3%
Surprised 3.5%
Fear 3.3%
Angry 3.1%

AWS Rekognition

Age 29-39
Gender Male, 99.7%
Calm 62.2%
Happy 28%
Sad 3.8%
Disgusted 1.6%
Fear 1.5%
Angry 1.4%
Confused 0.9%
Surprised 0.5%

AWS Rekognition

Age 22-30
Gender Male, 79.3%
Calm 97.1%
Sad 1.4%
Confused 0.5%
Happy 0.3%
Angry 0.2%
Fear 0.2%
Surprised 0.2%
Disgusted 0.1%

AWS Rekognition

Age 21-29
Gender Female, 54.2%
Calm 56.8%
Happy 13.4%
Angry 13.2%
Sad 5%
Confused 3.5%
Disgusted 3.3%
Surprised 2.4%
Fear 2.4%
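
Each AWS Rekognition block above corresponds to one FaceDetail from the DetectFaces API called with Attributes=["ALL"]: an age range, a gender estimate with confidence, and a set of emotion scores. A minimal boto3 sketch, again with a hypothetical local filename:

    import boto3

    client = boto3.client("rekognition")  # assumes AWS credentials are configured

    with open("steinmetz_7676.jpg", "rb") as f:  # hypothetical local filename
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions are not guaranteed to arrive sorted; order by confidence
        # to match the descending lists above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')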

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
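
The Google Vision blocks above map to FaceAnnotation likelihood fields from the Cloud Vision client library's face_detection method; the enum values VERY_UNLIKELY through VERY_LIKELY are what the record renders as "Very unlikely" through "Very likely". A sketch assuming application default credentials and a hypothetical local file:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # assumes application credentials

    with open("steinmetz_7676.jpg", "rb") as f:  # hypothetical local filename
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # One block per detected face, matching the record above.
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)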

Feature analysis

Amazon

Person 99.7%

Categories

Text analysis

Amazon

of
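
The single "of" result above is the kind of word-level hit returned by Amazon Rekognition's DetectText API. A minimal sketch, with the filename again a placeholder:

    import boto3

    client = boto3.client("rekognition")  # assumes AWS credentials are configured

    with open("steinmetz_7676.jpg", "rb") as f:  # hypothetical local filename
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        # DetectText returns both LINE and WORD detections; the record above
        # shows a single detected word.
        if detection["Type"] == "WORD":
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')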