Human Generated Data

Title

Untitled (three spectators watching a man doing a handstand with the help of another man)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4866

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Person 99.4
Person 99
Person 98.4
Clothing 82.3
Apparel 82.3
Female 78.6
People 76.4
Leisure Activities 72.1
Face 70.4
Furniture 67
Girl 66
Photo 65.9
Photography 65.9
Portrait 65.9
Person 64.5
Table Lamp 63.6
Lamp 63.6
Dance 63.4
Woman 58.6
Indoors 58.5
Room 58.5
Living Room 57.9
Home Decor 57.2
Bed 56.5
Bedroom 55.5
Kid 55.2
Child 55.2

Imagga
created on 2022-01-23

sax 35.1
man 22.8
people 19
person 17
oboe 16.3
wind instrument 16.2
male 15.8
adult 15.3
portrait 14.2
dress 12.6
lifestyle 12.3
happy 11.9
model 11.7
life 11.4
men 11.2
black 10.8
posing 10.7
fashion 10.5
human 10.5
couple 10.4
sexy 10.4
performer 10.4
sensuality 10
leisure 10
silhouette 9.9
bride 9.6
drawing 9.5
love 9.5
light 9.4
outdoor 9.2
sketch 9.1
sky 8.9
style 8.9
clothing 8.7
day 8.6
play 8.6
cute 8.6
musician 8.5
modern 8.4
attractive 8.4
negative 8.3
wedding 8.3
lady 8.1
activity 8.1
body 8
hair 7.9
women 7.9
standing 7.8
sport 7.8
stage 7.7
wall 7.7
grunge 7.7
old 7.7
film 7.6
musical instrument 7.6
joy 7.5
woodwind 7.5
city 7.5
outdoors 7.5
holding 7.4
art 7.3
music 7.3
smile 7.1
bassoon 7.1
interior 7.1
summer 7.1
happiness 7
singer 7

Microsoft
created on 2022-01-23

text 92.5
clothing 89.3
person 88.9
outdoor 88.9

Face analysis

Amazon

AWS Rekognition

Age 29-39
Gender Female, 97.1%
Happy 56.2%
Calm 28%
Surprised 10.7%
Angry 1.7%
Sad 1.4%
Fear 1.1%
Disgusted 0.6%
Confused 0.3%

AWS Rekognition

Age 21-29
Gender Male, 91.3%
Happy 58.5%
Calm 34.6%
Surprised 3.4%
Sad 1.1%
Disgusted 1.1%
Confused 0.8%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 48-56
Gender Female, 99.2%
Sad 68.4%
Calm 24%
Confused 2.2%
Happy 1.7%
Angry 1.4%
Disgusted 0.9%
Surprised 0.9%
Fear 0.5%

Feature analysis

Amazon

Person 99.6%

Captions

Microsoft

a group of people around each other 59%
a group of people on a stage 57.5%
a group of people jumping in the air 46.6%

Text analysis

Amazon

3014
-HIOE
H-30-31
H-30-31 -2014-
-2014-
I to
roloma I to وسلم retased -HIOE
retased
وسلم
roloma

Google

२० -२००३ 301d -२०-
301d
२०
-२००३
-२०-