Human Generated Data

Title

Untitled (woman doing a split between two chairs)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5685

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Furniture 100
Chair 100
Apparel 99.3
Clothing 99.3
Person 99
Human 99
Chair 98.3
Face 89
Female 86.2
Dress 81.5
Plant 79
Tree 79
Outdoors 78.4
Table 77
Dining Table 76.4
Photography 73.9
Portrait 73.9
Photo 73.9
Woman 72.9
Pants 71.6
Drawing 71.5
Art 71.5
Sleeve 70.8
Shorts 69.7
Leisure Activities 66.9
Text 64.6
Long Sleeve 64.4
Nature 62.5
Dance Pose 60
Footwear 57.8
Shoe 57.8
Flooring 56.1
Robe 55.6
Fashion 55.6
Shirt 55.5
Floor 55.1

Imagga
created on 2021-12-15

sketch 100
drawing 85.7
representation 67.7
people 22.3
person 19.7
portrait 17.5
attractive 16.8
man 16.2
fashion 15.8
adult 15.6
active 15.3
outdoor 14.5
posing 14.2
model 14
snow 13.9
fitness 12.7
sport 12.5
human 12
pose 11.8
happy 11.3
pretty 11.2
action 11.1
elegance 10.9
exercise 10.9
lifestyle 10.8
dress 10.8
black 10.8
lady 10.5
outdoors 10.5
body 10.4
style 10.4
hair 10.3
winter 10.2
happiness 10.2
cute 10
alone 10
sax 10
leisure 10
male 9.9
jump 9.6
women 9.5
dance 9.5
men 9.4
wall 9.4
grunge 9.4
silhouette 9.1
old 9.1
fun 9
couple 8.7
forest 8.7
casual 8.5
modern 8.4
dancer 8.4
summer 8.4
joy 8.4
sensuality 8.2
sexy 8
motion 7.7
performer 7.7
leg 7.7
energy 7.6
power 7.6
one 7.5
park 7.4
success 7.2
sunset 7.2
activity 7.2
smile 7.1
swing 7.1
cool 7.1
businessman 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

sketch 97.1
drawing 97
text 96.7
outdoor 96.5
clothing 65.1
person 60.4
woman 54.1
cartoon 53.3
black and white 52.1

Face analysis

AWS Rekognition

Age 21-33
Gender Female, 61.2%
Happy 73.6%
Calm 23.1%
Sad 1.2%
Surprised 0.7%
Angry 0.5%
Fear 0.5%
Confused 0.2%
Disgusted 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Chair 100%
Person 99%

Captions

Microsoft

a man that is standing in the snow 45.8%
a man jumping in the air 44.4%

Text analysis

Amazon

5
13934.

Google

13934. ।3१34.
।3१34.
13934.