Human Generated Data

Title

Untitled (man and woman performing with piano player)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7016

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 99.8
Person 99.8
Person 99.3
Chair 99.2
Furniture 99.2
Clothing 99
Apparel 99
Footwear 97.4
Shoe 97.4
Hat 95.7
Person 95.3
Shoe 93.4
Shoe 91.6
Shoe 89.3
Shoe 83.7
Text 78.9
Shorts 76.1
Sleeve 75.5
Female 66.5
Face 66.2
Word 63.9
Photography 62.1
Photo 62.1
Portrait 62.1
Woman 57
Home Decor 56.2
Pants 56

Imagga
created on 2021-12-15

people 32.9
person 28.4
adult 25.7
man 20.9
clothing 20.2
portrait 20.1
male 19.9
fashion 19.6
bathing cap 18.9
cap 17.6
human 17.2
smile 17.1
happy 16.3
attractive 16.1
smiling 15.9
model 15.6
body 15.2
pretty 14.7
women 14.2
business 14
face 13.5
headdress 13.4
sexy 12.8
lady 12.2
standing 12.2
men 12
art 11.9
happiness 11.8
businessman 11.5
looking 11.2
group 10.5
professional 10.3
casual 10.2
suit 10
dress 9.9
silhouette 9.9
family 9.8
health 9.7
style 9.6
sculpture 9.2
studio 9.1
businesswoman 9.1
black 9
handsome 8.9
life 8.7
love 8.7
lifestyle 8.7
cute 8.6
fashionable 8.5
walking 8.5
clothes 8.4
worker 8.3
one 8.2
girls 8.2
pose 8.2
posing 8
sibling 7.9
medical 7.9
look 7.9
vertical 7.9
work 7.8
couple 7.8
brunette 7.8
old 7.7
fresh 7.2
bright 7.1
hair 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.3
posing 93.3
clothing 91.8
person 85.2
standing 83.9
human face 59.5
footwear 54.5

Face analysis

Amazon

Google

AWS Rekognition

Age 22-34
Gender Male, 71%
Surprised 60.4%
Fear 19.1%
Calm 16%
Sad 1.6%
Confused 1.1%
Happy 0.7%
Angry 0.6%
Disgusted 0.4%

AWS Rekognition

Age 21-33
Gender Male, 89.3%
Calm 60.4%
Surprised 24.6%
Happy 8.7%
Confused 3.4%
Sad 1.1%
Disgusted 1.1%
Angry 0.5%
Fear 0.2%

AWS Rekognition

Age 32-48
Gender Female, 59.2%
Happy 82.1%
Calm 13.6%
Sad 1.7%
Confused 0.7%
Surprised 0.7%
Angry 0.7%
Fear 0.5%
Disgusted 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Chair 99.2%
Shoe 97.4%
Hat 95.7%

Captions

Microsoft

a group of people posing for a photo 93.2%
a group of people posing for the camera 93.1%
a group of people posing for a picture 93%

Text analysis

Amazon

ME
TAKE ME
TAKE
HOME
2.98
GAMBLING
22407
NO
SMOKING
es
KODVR-EVERLA

Google

NO snOKING GAMBLING TAKE ME HOME $2.98 VEE. KODVR 22407
NO
TAKE
VEE.
snOKING
$2.98
KODVR
22407
GAMBLING
ME
HOME