Human Generated Data

Title

Untitled (Mask and Wig members wearing dresses and dancing on stage)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10669

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Human 95.7
Person 94
Person 93.7
Horse 91.9
Animal 91.9
Mammal 91.9
Person 91.2
Person 90.3
Person 89.9
Dance Pose 89.2
Leisure Activities 89.2
Crowd 87.3
Clothing 83.5
Apparel 83.5
People 83.2
Musician 82.1
Musical Instrument 82.1
Person 81.1
Helmet 78.5
Shorts 76
Person 73
Text 63.2
Person 62
Music Band 57.2
Suit 56.1
Coat 56.1
Overcoat 56.1
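
The label/score pairs above have the output shape of Amazon Rekognition's label-detection API. A minimal sketch of how such tags can be produced, assuming boto3 is configured with AWS credentials; the file name image.jpg is a placeholder, not a value from this record:

    import boto3

    # Rekognition label detection returns label names with confidence
    # scores, comparable to the list above (e.g., "Person 94", "Horse 91.9").
    client = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=30,        # cap on the number of labels returned
            MinConfidence=50.0,  # drop low-confidence labels
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")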

Clarifai
created on 2023-10-26

people 99.1
many 96.7
group together 94.2
group 94.1
military 91.6
uniform 89.6
man 89.1
soldier 82.7
war 78
illustration 75
wear 74.8
victory 73.4
adult 73.4
populace 72.3
crowd 71.2
spectator 68.4
outfit 67.6
woman 61.9
army 61.9
vehicle 61.2
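
Tags like these typically come from Clarifai's general image-recognition model. A hedged sketch against Clarifai's v2 REST predict endpoint using requests; the API key, model ID, and image URL are placeholders, not values from this record:

    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder credential
    MODEL_ID = "general-image-recognition"  # assumed general model ID
    IMAGE_URL = "https://example.com/image.jpg"

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    response.raise_for_status()

    # Concepts carry a 0-1 confidence; print as a percentage to match the list.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")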

Imagga
created on 2022-01-15

vehicle 24.1
man 16.8
male 14.2
sea 14.2
military vehicle 14
city 13.3
person 12.4
people 12.3
travel 12
beach 11.8
ocean 11.6
war 11.4
water 11.3
ship 11
aircraft carrier 10.4
landscape 10.4
sport 10.3
men 10.3
transport 10
outdoor 9.9
warship 9.8
outdoors 9.7
group 9.7
military 9.6
sky 9.6
clothing 9.3
aircraft 9.1
pedestrian 9.1
transportation 9
sand 8.7
uniform 8.7
day 8.6
winter 8.5
building 8.2
adult 8.2
vacation 8.2
conveyance 8
lifestyle 7.9
business 7.9
black 7.8
color 7.8
shore 7.4
tourism 7.4
bobsled 7.4
danger 7.3
professional 7.2
team 7.2
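
Imagga's tagging endpoint returns the same tag-plus-confidence structure. A sketch against its public v2 tags endpoint; the key, secret, and image URL are placeholders:

    import requests

    API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder credentials
    API_SECRET = "YOUR_IMAGGA_API_SECRET"
    IMAGE_URL = "https://example.com/image.jpg"

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),  # HTTP Basic auth with key/secret
    )
    response.raise_for_status()

    # Imagga confidences are already on a 0-100 scale (e.g., "vehicle 24.1").
    for tag in response.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")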

Google
created on 2022-01-15

(no tags were recorded for this service)

Microsoft
created on 2022-01-15

outdoor 97.6
text 96
transport 81.7
black and white 65.1
pulling 35.7
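
The Microsoft tags match the shape returned by Azure's Computer Vision image-analysis API. A sketch against the v3.2 REST endpoint; the resource endpoint, subscription key, and image URL are placeholders:

    import requests

    ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

    response = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/json",
        },
        json={"url": "https://example.com/image.jpg"},
    )
    response.raise_for_status()

    # Azure confidences are 0-1; print as percentages to match the list above.
    for tag in response.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")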

Color Analysis

(no color data recorded)

Face analysis

Amazon

AWS Rekognition

Age 30-40
Gender Male, 62.9%
Sad 92.4%
Fear 2.2%
Confused 1.4%
Calm 1.4%
Surprised 0.8%
Disgusted 0.7%
Angry 0.6%
Happy 0.5%

AWS Rekognition

Age 49-57
Gender Male, 99.8%
Sad 58.3%
Surprised 15%
Confused 7.4%
Fear 6.1%
Angry 5.1%
Disgusted 3.6%
Calm 2.4%
Happy 2%

AWS Rekognition

Age 30-40
Gender Male, 85.7%
Surprised 75.9%
Fear 6.4%
Sad 4.1%
Angry 4%
Disgusted 3.1%
Confused 2.7%
Happy 2%
Calm 1.8%

AWS Rekognition

Age 35-43
Gender Male, 98.5%
Happy 40%
Calm 20.3%
Surprised 16.1%
Angry 11.1%
Sad 4%
Disgusted 3.6%
Fear 3.4%
Confused 1.5%

AWS Rekognition

Age 45-53
Gender Female, 96.2%
Sad 78%
Fear 6.1%
Calm 4.8%
Confused 4.5%
Surprised 3.7%
Disgusted 1.3%
Angry 0.9%
Happy 0.7%

AWS Rekognition

Age 37-45
Gender Male, 95%
Fear 20.5%
Surprised 20.2%
Sad 18.2%
Happy 18.2%
Angry 7.9%
Calm 6.2%
Disgusted 4.5%
Confused 4.3%

AWS Rekognition

Age 21-29
Gender Male, 95.1%
Calm 98.9%
Sad 0.3%
Disgusted 0.2%
Fear 0.2%
Surprised 0.1%
Confused 0.1%
Angry 0.1%
Happy 0.1%
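
Each block above is one face found by AWS Rekognition, with an estimated age range, a gender guess, and a confidence-ranked emotion distribution. A minimal sketch of the call that yields this structure, under the same assumptions as the earlier Rekognition sketch (boto3 credentials, placeholder file name):

    import boto3

    client = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unsorted; sort descending to mirror the listing.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")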

Feature analysis

Amazon

Person 94%
Horse 91.9%
Helmet 78.5%
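
These entries repeat the Rekognition labels that also carry localized detections; plausibly they come from the Instances field of the same detect_labels response sketched earlier, where labels such as Person and Horse get bounding boxes:

    # Continuing from the detect_labels response above: labels with detected
    # instances expose per-instance confidences and bounding boxes.
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            print(f"{label['Name']} {instance['Confidence']:.1f}%")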

Text analysis

Amazon

21409
state

Google

3RA2-MAMTZAJ
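
The strings above are raw OCR detections: Rekognition found "21409" and "state", while the Google detector returned the "3RA2-MAMTZAJ" reading. A sketch of the Amazon side using Rekognition's text detection, under the same placeholder assumptions as the earlier sketches:

    import boto3

    client = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # Each detection is a LINE or WORD with the recognized string; printing
    # LINE entries reproduces a list like the one above.
    for text in response["TextDetections"]:
        if text["Type"] == "LINE":
            print(text["DetectedText"])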