Human Generated Data

Title

Untitled (family posing for picture, Florasota Gardens)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8731

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Clothing 99.9
Apparel 99.9
Human 99.7
Person 99.7
Person 99.6
Person 99.4
Female 99
Shorts 98.9
Person 96.5
Skirt 96.3
Woman 94
Dress 79.8
Person 73.5
Girl 72.8
Advertisement 70.9
Poster 70.9
People 70.6
Face 63.4
Photo 62.9
Photography 62.9
Portrait 62.9
Plant 60.4
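
The tag list above pairs each detected label with a confidence score. Below is a minimal sketch of how such label/confidence pairs could be retrieved, assuming the Amazon tags come from the Rekognition DetectLabels API via boto3; the file name steinmetz_8731.jpg and the region are illustrative assumptions.

import boto3

# Assumed client setup; region and credentials are illustrative
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_8731.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,         # cap the number of labels returned
    MinConfidence=60.0,   # drop labels scored below 60%
)

# Prints "Label Confidence" pairs, e.g. "Clothing 99.9"
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")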

Imagga
created on 2022-01-09

clothing 27.9
person 26.5
fashion 24.1
people 21.8
sexy 21.7
sport 21.7
adult 20.5
posing 20.4
black 20
attractive 19.6
pretty 18.9
portrait 18.8
model 18.7
style 18.5
exercise 17.2
pose 17.2
body 16.8
dress 16.3
fitness 16.3
lifestyle 15.9
garment 15.3
man 14.8
lady 13.8
dancer 13.7
gorgeous 13.6
hair 13.5
women 13.4
legs 13.2
human 12.7
covering 12.7
cute 12.2
brunette 12.2
face 12.1
male 12.1
performer 11.8
elegance 11.8
modern 11.2
motion 11.1
action 11.1
health 11.1
active 11.1
sensual 10.9
sensuality 10.9
stylish 10.8
leg 10.7
standing 10.4
cool 9.8
dance 9.8
athlete 9.6
cap 9.6
bathing cap 9.6
strength 9.4
consumer goods 9.3
slim 9.2
dark 9.2
makeup 9.2
city 9.1
studio 9.1
silhouette 9.1
healthy 8.8
swimsuit 8.8
look 8.8
planner 8.8
men 8.6
leisure 8.3
street 8.3
newspaper 8.3
one 8.2
art 7.8
nude 7.8
glamorous 7.7
gym 7.7
maillot 7.6
equipment 7.6
clothes 7.5
fit 7.4
teenager 7.3
looking 7.2

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 96.2
person 95.8
footwear 93.1
clothing 91.4
outdoor 90.6
woman 61.7
smile 58.3

Face analysis

Amazon

Google

AWS Rekognition

Age 45-53
Gender Male, 99.4%
Calm 70.8%
Happy 22%
Surprised 3.3%
Disgusted 2.4%
Confused 0.7%
Fear 0.4%
Angry 0.2%
Sad 0.2%

AWS Rekognition

Age 39-47
Gender Male, 93.3%
Happy 99.3%
Calm 0.3%
Fear 0.1%
Surprised 0.1%
Disgusted 0.1%
Confused 0.1%
Angry 0.1%
Sad 0.1%

AWS Rekognition

Age 35-43
Gender Male, 99%
Happy 95.5%
Calm 2.8%
Surprised 0.6%
Sad 0.3%
Angry 0.3%
Disgusted 0.2%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 45-53
Gender Female, 97.1%
Happy 97.5%
Calm 1.1%
Sad 0.6%
Angry 0.3%
Confused 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%
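
The four face records above each report an estimated age range, a gender guess with confidence, and a ranked list of emotions. Below is a minimal sketch, assuming they come from the Rekognition DetectFaces API via boto3; the file name and region are illustrative assumptions.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_8731.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; list them from most to least confident
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")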

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely
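
The Google Vision face records above rate each attribute on a likelihood scale (Very unlikely through Very likely) rather than a percentage. Below is a minimal sketch, assuming the google-cloud-vision client library with default credentials; the file name is an illustrative assumption.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_8731.jpg", "rb") as f:  # hypothetical local copy of the image
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Each attribute is a Likelihood enum such as VERY_UNLIKELY or POSSIBLE
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)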

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people posing for a picture 83.7%
a group of people posing for the camera 81.2%
a group of people posing for a photo 75.8%

Text analysis

Amazon

38332
BED

Google

BED YT37A°2-AGO
BED
YT37A°2-AGO
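
The Amazon text detections above ("38332", "BED") list the strings Rekognition could read in the image. Below is a minimal sketch, assuming the DetectText API via boto3; the file name and region are illustrative assumptions.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_8731.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Each detection is either a whole LINE or a single WORD within a line
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])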