Human Generated Data

Title

Untitled (Mask and Wig members dancing to piano)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8225

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.1
Human 99.1
Person 98.9
Person 98.8
Person 98.4
Person 97.7
Person 96.9
Person 93.6
Person 93.4
Person 86.1
Person 85.6
Shorts 85.4
Clothing 85.4
Apparel 85.4
Chair 82.8
Furniture 82.8
Art 82.3
Person 81.9
Person 79.9
Flooring 78.2
Person 76.1
Person 76
People 72.7
Person 70.6
Person 69.1
Floor 65.7
Sitting 64.4
Indoors 63.3
Person 61.8
Crowd 61.5
Text 61.4
Drawing 58.8
Portrait 56.2
Photography 56.2
Face 56.2
Photo 56.2
Female 56.1
Person 43

Clarifai
created on 2023-10-25

people 99.9
man 96.7
group 96.4
many 95.8
group together 95.2
wear 95.2
adult 95.2
woman 95.1
child 92.2
boy 91.1
administration 88
education 86.5
recreation 84.9
music 84.3
art 81.4
athlete 80.3
uniform 79.3
adolescent 78.8
bench 77
dancer 74.3

Imagga
created on 2022-01-08

people 20.1
black 19.2
building 17.9
silhouette 16.5
city 15
architecture 14.8
old 14.6
grunge 14.5
man 14.1
urban 13.1
group 12.9
adult 12.4
person 12.4
vintage 11.7
crowd 11.5
art 11.4
travel 11.3
business 10.9
balcony 10.4
men 10.3
lifestyle 10.1
musical instrument 10.1
male 9.9
history 9.8
structure 9.7
style 9.6
design 9.6
world 9.1
dirty 9
women 8.7
historical 8.5
modern 8.4
relaxation 8.4
percussion instrument 8.3
tourism 8.2
tourist 8.2
transportation 8.1
graphic 8
night 8
sport 7.8
scene 7.8
motion 7.7
move 7.7
sky 7.6
texture 7.6
grungy 7.6
monument 7.5
outdoors 7.5
famous 7.4
water 7.3
window 7.3
music 7.2
life 7.1
working 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 97
person 95.7
man 76.7
clothing 75.7
people 58.9
group 57.8
black and white 55.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 21-29
Gender Female, 55%
Calm 88.1%
Sad 8.4%
Fear 1.6%
Angry 0.5%
Confused 0.5%
Disgusted 0.4%
Happy 0.3%
Surprised 0.2%

AWS Rekognition

Age 19-27
Gender Female, 62.6%
Calm 34.1%
Angry 28.7%
Sad 14.2%
Disgusted 11.2%
Confused 4.7%
Happy 4%
Fear 1.8%
Surprised 1.3%

AWS Rekognition

Age 22-30
Gender Female, 74.9%
Sad 76.8%
Calm 16.7%
Fear 2%
Disgusted 1.3%
Happy 1%
Angry 0.9%
Confused 0.9%
Surprised 0.5%

AWS Rekognition

Age 26-36
Gender Female, 88.4%
Calm 57%
Happy 22.1%
Sad 6.8%
Fear 3.9%
Disgusted 3.8%
Confused 2.6%
Angry 2.1%
Surprised 1.8%

AWS Rekognition

Age 23-31
Gender Female, 58.6%
Calm 92.3%
Sad 4.8%
Happy 1.2%
Confused 1.2%
Disgusted 0.2%
Angry 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 23-33
Gender Female, 55.9%
Calm 38.4%
Confused 23.2%
Sad 14.9%
Happy 14.5%
Fear 4.3%
Disgusted 2.2%
Surprised 1.4%
Angry 1.1%

AWS Rekognition

Age 26-36
Gender Female, 76.7%
Sad 65.1%
Fear 13.4%
Calm 6.3%
Happy 5.4%
Angry 4%
Disgusted 3%
Confused 1.9%
Surprised 0.9%

AWS Rekognition

Age 19-27
Gender Female, 82.5%
Calm 66%
Sad 17.5%
Disgusted 4.8%
Happy 4.2%
Fear 3.5%
Angry 1.8%
Confused 1.2%
Surprised 0.9%

AWS Rekognition

Age 33-41
Gender Male, 94.1%
Sad 76.7%
Calm 13.1%
Surprised 2.5%
Fear 2.3%
Disgusted 1.5%
Happy 1.5%
Angry 1.4%
Confused 1.1%

AWS Rekognition

Age 48-54
Gender Male, 77.2%
Calm 88.8%
Sad 4.8%
Confused 2.9%
Disgusted 1.1%
Surprised 0.7%
Fear 0.6%
Happy 0.6%
Angry 0.4%

Feature analysis

Amazon

Person 99.1%
Chair 82.8%

Categories

Text analysis

Amazon

7618A
7618A...ru