Human Generated Data

Title

Untitled (seated woman crocheting)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7756

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 96.8%
Human 96.8%
Clothing 96.8%
Apparel 96.8%
Chair 96.6%
Furniture 96.6%
Home Decor 84%
Face 74.4%
Cushion 72.7%
Portrait 67.9%
Photography 67.9%
Photo 67.9%
Female 64%
Hair 62.8%
Sitting 62.4%
Couch 60.6%
Sleeve 59%
Door 58.8%
Bed 55.7%

Imagga
created on 2022-01-09

man 27.5%
male 26.9%
person 26.2%
people 25.6%
adult 21.6%
coat 17.7%
business 17.6%
professional 17%
smiling 16.6%
happy 15.7%
attractive 15.4%
work 14.9%
corporate 14.6%
smile 14.2%
men 13.7%
home 13.6%
portrait 12.9%
worker 12.3%
couple 12.2%
office 12%
looking 12%
clothing 11.9%
lifestyle 11.6%
black 11.5%
working 11.5%
indoors 11.4%
brunette 11.3%
pretty 11.2%
holding 10.7%
businessman 10.6%
fashion 10.5%
scholar 10.3%
happiness 10.2%
lab coat 10.2%
laptop 10.2%
newspaper 10.1%
student 10%
dress 9.9%
handsome 9.8%
medical 9.7%
education 9.5%
executive 9.5%
suit 9.3%
face 9.2%
house 9.2%
cheerful 8.9%
computer 8.9%
child 8.8%
product 8.8%
sitting 8.6%
device 8.6%
successful 8.2%
one 8.2%
intellectual 8.2%
businesswoman 8.2%
family 8%
job 8%
hair 7.9%
cute 7.9%
love 7.9%
life 7.9%
urban 7.9%
standing 7.8%
casual 7.6%
two 7.6%
building 7.6%
blond 7.5%
city 7.5%
study 7.5%
garment 7.5%
teamwork 7.4%
kitchen 7.4%
success 7.2%
color 7.2%
team 7.2%
kid 7.1%
interior 7.1%
creation 7.1%

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.7%
person 98.7%
black and white 95.7%
window 81.2%
clothing 76.6%
glasses 67.3%
monochrome 52.9%
human face 50.3%

Face analysis

AWS Rekognition

Age 50-58
Gender Female, 66%
Happy 91.8%
Sad 3.7%
Angry 1.4%
Disgusted 1%
Confused 0.8%
Surprised 0.5%
Fear 0.4%
Calm 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.8%

Captions

Microsoft

a person sitting in front of a window 45.5%
a person standing in front of a window 45.4%
a person sitting at a table in front of a window 45.3%

Text analysis

Amazon

24546.
YE3AD

Google

245-46.
245-46.